Gesture control

Hint

The operating environment and software and hardware configurations are as follows:

  • OriginBot Pro
  • PC: Ubuntu (≥22.04) + ROS 2 (≥Humble)

Start the robot chassis

After the SSH connection to OriginBot is successful, enter the following command in the terminal to start the robot chassis:

ros2 launch originbot_bringup originbot.launch.py


Enable the gesture control function

cd /userdata/dev_ws
export CAM_TYPE=usb
# Start the launch file
ros2 launch gesture_control gesture_control.launch.py


Attention

When starting an application, make sure its configuration file is in the current working directory. Otherwise, the application will not find the configuration file and will fail to run.
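Before launching, you can verify that the required files are present in the current directory. The sketch below is a minimal, generic check; the actual configuration file names used by gesture_control are not listed here, so the `"config"` entry is only a hypothetical placeholder to substitute with your own.

```python
import os
import sys

def check_config(required_paths):
    """Return the required paths that are missing from the current directory."""
    return [p for p in required_paths if not os.path.exists(p)]

# "config" is a hypothetical placeholder; replace it with the file or
# directory your launch file actually loads.
missing = check_config(["config"])
if missing:
    print(f"Missing from {os.getcwd()}: {missing}", file=sys.stderr)
```

Run this from the same directory you launch from (e.g. /userdata/dev_ws), since that is the path the application resolves against.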

Gesture control effects

After the startup is successful, stand in front of the OriginBot camera and use the following gestures to control the robot's movement.

Control Gesture       Function
666 Gesture/Awesome   Move forward
Yeah/Victory          Move backward
Thumb Right           Turn right
Thumb Left            Turn left
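The mapping above can be sketched as a lookup from a recognized gesture label to a velocity command. This is an illustrative sketch only: the label strings and speed values are assumptions, not the ones defined by the gesture_control package.

```python
from dataclasses import dataclass

@dataclass
class Twist2D:
    linear_x: float = 0.0   # m/s: + forward, - backward
    angular_z: float = 0.0  # rad/s: + turn left, - turn right

# Hypothetical labels and speeds; the real package defines its own
# gesture names and control strategy parameters.
GESTURE_COMMANDS = {
    "awesome":     Twist2D(linear_x=0.2),    # 666 gesture -> move forward
    "victory":     Twist2D(linear_x=-0.2),   # yeah/victory -> move backward
    "thumb_right": Twist2D(angular_z=-0.5),  # turn right
    "thumb_left":  Twist2D(angular_z=0.5),   # turn left
}

def gesture_to_twist(gesture):
    """Map a recognized gesture label to a velocity command; stop if unknown."""
    return GESTURE_COMMANDS.get(gesture, Twist2D())
```

Falling back to a zero command for unrecognized gestures keeps the robot stationary when recognition is uncertain, which is the safe default for a teleoperation-style controller.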


Visualization on the host computer

Open a browser and visit the robot's IP address to see the real-time visual recognition results.

Introduction to the principle

The gesture control function controls the movement of the robot car through gestures, including turning left/right and moving forward/backward. It consists of MIPI image acquisition, human body detection and tracking, hand keypoint detection, gesture recognition, the gesture control strategy, image encoding/decoding, and a web display. The workflow is as follows:

(Flowchart: image acquisition → body detection and tracking → hand keypoint detection → gesture recognition → control strategy → image encoding/decoding → web display)
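The stages described above chain together as a simple per-frame pipeline. The sketch below shows that structure only; each stage is passed in as a callable stand-in, since the real nodes are separate ROS 2 processes, not Python functions.

```python
def run_pipeline(frame, detector, keypoint_model, classifier, strategy):
    """Chain the stages described above for one frame:
    body detection -> hand keypoint detection -> gesture recognition
    -> control strategy. Returns a control command, or None if no
    person is detected in the frame."""
    body = detector(frame)
    if body is None:
        return None                      # nobody in view: issue no command
    keypoints = keypoint_model(frame, body)
    gesture = classifier(keypoints)
    return strategy(gesture)
```

In the actual application each stage runs as its own node and communicates over ROS 2 topics; this function only illustrates the data flow between them.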

For a detailed explanation of the principle, please see:

https://developer.d-robotics.cc/rdk_doc/Robot_development/apps/car_gesture_control
