Develop Object Tracking Robot Using Arduino Hardware and Simulink
This example shows how to use Simulink® Support Package for Arduino® Hardware and an Arduino hardware board to develop an object tracking robot.
Introduction
Object tracking is a common robotics application that involves interacting with a colored object. Relatively easy to build, an object tracking robot can estimate the trajectory of a colored moving object with the help of a vision sensor, without any human intervention. Simulink Support Package for Arduino Hardware enables you to build a simple object tracking robot by interfacing a Pixy2 vision sensor and motor drivers with an Arduino hardware board.
This example uses a two-wheeled robot built with an Arduino hardware board. The robot tracks a colored moving object using a Pixy2 vision sensor. The robot aligns itself with the trained colored object while maintaining a threshold distance.
Prerequisites
Complete the Get Started with Arduino Hardware and Communicating with Arduino Hardware examples.
Required Hardware
To run this example, you will need the following hardware.
Arduino hardware board
Two DC motors
Motor shield based on the L293D or PCA9685 chip
Batteries
Pixy2 Vision Sensor
Chassis with wheels to set up the above hardware and make a robot
Hardware Setup
Sensor Setup
1. Build a two-wheeled robot with a Pixy2 vision sensor mounted on it.
2. Connect the Pixy2 vision sensor to the Arduino hardware board using the supplied Arduino cable. If you do not have the cable, refer to Pin Connections to Pixy2 for customized connections.
3. Set up the Pixy2 vision sensor using the PixyMon utility. PixyMon is an application that lets you view what the Pixy2 vision sensor sees, as either unprocessed or processed video.
Configure I2C Address: Open the PixyMon utility and navigate to File > Configure > Pixy Parameters (saved on Pixy) > Interface. Configure the I2C address between 0x54 and 0x57.
Configure Sensor to Detect Objects: You can use the PixyMon utility to configure the sensor to detect up to seven unique colored objects. For more information on how to train the sensor for Color Signature 1, refer to Train Pixy2.
Configure Line Tracking Mode: Open the PixyMon utility and navigate to File > Configure > Pixy Parameters (saved on Pixy) > Expert. Select the Delayed turn check box to enable intersection detection.
DC Motor Setup
1. Use two motors to control the direction of the robot and connect them through a motor shield based on a chip such as the L293D or PCA9685. Power the driver hardware with an external power supply, such as batteries, and couple the two motors to the two wheels of the chassis. You can refer to the motors as left and right depending on how they are connected to the robot.
2. Connect the enable pin, input A and input B of the motor shield to GPIO pins of the Arduino hardware board. The enable pin must be connected to one of the supported PWM pins. For more information on pin mapping, refer to Pin Mapping for Arduino Timer Dependent Blocks.
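Before working with the Simulink model, you can optionally verify the motor wiring with a short Arduino sketch such as the one below. This is only a minimal sketch for an L293D-style driver with one enable (PWM) pin and two direction inputs per motor; the pin numbers are placeholders, not the pins used by this example.

// Minimal wiring check for one motor channel of an L293D-based shield.
// Pin numbers are assumptions; use the GPIO and PWM pins you wired.
const int enablePin = 5;   // must be a PWM-capable pin
const int inputAPin = 7;   // direction input A
const int inputBPin = 8;   // direction input B

void setup() {
  pinMode(enablePin, OUTPUT);
  pinMode(inputAPin, OUTPUT);
  pinMode(inputBPin, OUTPUT);
}

void loop() {
  // Spin forward at roughly half speed for two seconds.
  digitalWrite(inputAPin, HIGH);
  digitalWrite(inputBPin, LOW);
  analogWrite(enablePin, 128);
  delay(2000);

  // Reverse direction for two seconds.
  digitalWrite(inputAPin, LOW);
  digitalWrite(inputBPin, HIGH);
  delay(2000);

  // Stop for one second.
  analogWrite(enablePin, 0);
  delay(1000);
}

If the motor spins in the opposite direction from what you expect, swap the connections to input A and input B.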
Simulink Model
This example uses a preconfigured Simulink model from the Simulink Support Package for Arduino Hardware.
Open the arduino_robotics_objecttracking Simulink model.
The model is divided into three areas based on the functionality in each area.
Data Source
The Pixy2 Vision Sensor block in this area outputs the coordinates (X,Y), width, height, and object count of the biggest object of the selected color signature. You can configure the following parameters in the block:
1. Set the I2C address parameter to the same I2C address that you configured in PixyMon.
2. The default Tracking mode for the Pixy2 vision sensor is Color Signature. You can use other tracking options such as Color Code Signature, Color and Color Code Signature (Mixed), and Line.
3. The default Color Signature is 1. You can use ALL to track all the color signatures.
4. Set the Sample time parameter to specify how often the block reads values from the Pixy2 vision sensor.
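The Pixy2 Vision Sensor block handles the sensor communication for you. For reference only, the following Arduino sketch, based on the standard Pixy2 Arduino library (I2C variant), shows the same kind of information the block reports; treat it as an illustrative sketch rather than part of this example.

// Prints what the Pixy2 reports for the largest detected block over I2C.
#include <Pixy2I2C.h>

Pixy2I2C pixy;

void setup() {
  Serial.begin(115200);
  pixy.init();  // connects to the sensor on its configured I2C address
}

void loop() {
  // Request the current color connected components (CCC) blocks.
  pixy.ccc.getBlocks();

  if (pixy.ccc.numBlocks > 0) {
    // The first block corresponds to the largest detected object.
    Serial.print("x: ");       Serial.print(pixy.ccc.blocks[0].m_x);
    Serial.print(" y: ");      Serial.print(pixy.ccc.blocks[0].m_y);
    Serial.print(" width: ");  Serial.print(pixy.ccc.blocks[0].m_width);
    Serial.print(" height: "); Serial.print(pixy.ccc.blocks[0].m_height);
    Serial.print(" count: ");  Serial.println(pixy.ccc.numBlocks);
  }
  delay(100);  // comparable to the Sample time of the Simulink block
}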
Algorithm
This area contains the Object Tracking subsystem, which refers to the PID for Alignment Control, PID for Proximity Control, Pixy2 X Frame Reference Value, and Pixy2 Y Bottom Threshold Value blocks in the Parameter Tuning area.
1. Use the PID for Alignment Control knobs and the Pixy2 X Frame Reference Value slider in the Parameter Tuning area to adjust the alignment of the robot with the object. The default value of Pixy2 X Frame Reference Value is 158.
2. Use the PID for Proximity Control knobs and the Pixy2 Y Bottom Threshold Value slider in the Parameter Tuning area to adjust the distance that the robot maintains from the object. The default value of Pixy2 Y Bottom Threshold Value is 190.
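To make the role of the two controllers concrete, here is a highly simplified sketch of the control idea: one loop keeps the object centered horizontally (alignment), and the other keeps it near a vertical threshold in the frame (proximity). Only proportional terms are shown, the gains are placeholders, and the sign convention assumes the Pixy2 image y coordinate increases toward the bottom of the frame; the Simulink model uses full PID controllers.

// Simplified illustration of the alignment and proximity control law.
const float X_REFERENCE  = 158.0f;  // Pixy2 X Frame Reference Value
const float Y_THRESHOLD  = 190.0f;  // Pixy2 Y Bottom Threshold Value
const float KP_ALIGNMENT = 0.8f;    // placeholder gain
const float KP_PROXIMITY = 1.2f;    // placeholder gain

// objectX, objectY come from the Pixy2 readings; the outputs are signed
// motor powers in the range -255..255 that the Actuators area applies.
void computeMotorPower(float objectX, float objectY,
                       float &leftPower, float &rightPower) {
  // Positive alignment error: object is to the right, so turn right.
  float alignmentError = objectX - X_REFERENCE;
  // Positive proximity error: object is high in the frame (far away),
  // so move forward; a negative error means the robot is too close.
  float proximityError = Y_THRESHOLD - objectY;

  float turn    = KP_ALIGNMENT * alignmentError;
  float forward = KP_PROXIMITY * proximityError;

  leftPower  = constrain(forward + turn, -255.0f, 255.0f);
  rightPower = constrain(forward - turn, -255.0f, 255.0f);
}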
Actuators
This area receives the left and right motor power from the Algorithm area. The received motor power controls the direction of the motors, the alignment of the robot with the object, and its proximity to the detected object.
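One possible way to turn a signed motor power into direction and speed commands on an L293D-style driver is sketched below, using the same Arduino calls and placeholder pin assignments as the wiring sketch in the DC Motor Setup section; it is not the implementation used by the model.

// Applies a signed power value (-255..255) to one motor channel:
// the sign selects direction via the two input pins, and the magnitude
// sets speed via PWM on the enable pin. Pin numbers are placeholders.
void driveMotor(int enablePin, int inputAPin, int inputBPin, int power) {
  bool forward = (power >= 0);
  digitalWrite(inputAPin, forward ? HIGH : LOW);
  digitalWrite(inputBPin, forward ? LOW : HIGH);
  analogWrite(enablePin, constrain(abs(power), 0, 255));
}

// Example: turn in place to the right at moderate speed.
// driveMotor(5, 7, 8, 150);    // left motor forward
// driveMotor(6, 4, 3, -150);   // right motor backward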
Deploy Simulink Model on Arduino Hardware
Follow these steps to deploy the Simulink model.
1. Open the arduino_robotics_objecttracking Simulink model.
2. On the Modeling tab of the model, click Model Settings. In the Configuration Parameters dialog box, click Hardware Implementation on the left pane and set the Hardware board parameter to the Arduino board that you are using.
3. On the Hardware tab of the Simulink model, in the Mode section, select Run on board.
4. In the Deploy section of the Simulink model, click Build, Deploy & Start. The generated code is deployed to the Arduino hardware and runs automatically.
5. Observe the robot track the object by driving its motors using the information acquired from the live Pixy2 vision sensor feed.