feat: autonomous person tracking with YOLO11n computer vision and IMU fusion #115
Conversation
The following cherry-picked commit transitions the tracking package from publishing Twist messages to publishing Command messages directly: 001474d – refactor: remove Twist dependency and implement direct Command publishing
```diff
@@ -1,6 +1,6 @@
 sensors:
   lidar: true
```
Can we check with Afreez what configuration people usually buy the mini_pupper_2 with, and set default values so that most people can use it without editing?
lidar and imu have now been set to true, and camera to false, as they are in the ros2-dev branch. Users have been advised to enable the camera if they wish to use the tracking package.
```python
    'zeta': 0.0,
}],
remappings=[
    ('imu/data_raw', 'imu/data'),  # Input: read from existing /imu/data
```
I usually get confused at this part: for input, you are remapping the node's imu/data_raw subscription onto the existing imu/data topic, which imu_filter_madgwick_node can then consume?
And for output, imu_filter_madgwick_node publishes on the topic imu/data, and you remap that to imu/qdata?
If possible, please select specific topic names like /imu/filtered_with_madgwick, because navigation packages usually use the very generic /imu/data.
The topic name has been changed from imu/qdata to imu/data_filtered_madgwick to make it more specific for clarity.
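The remapping pair discussed above could look as follows in a ROS 2 launch file. This is a sketch, not the PR's actual launch code: it assumes the standard imu_filter_madgwick package (which subscribes to imu/data_raw and publishes imu/data), and the parameter values are illustrative.

```python
# Sketch of the remapping discussed above; parameter values illustrative.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='imu_filter_madgwick',
            executable='imu_filter_madgwick_node',
            parameters=[{'use_mag': False, 'zeta': 0.0}],
            remappings=[
                # Input: the node subscribes on imu/data_raw, so point that
                # subscription at the existing imu/data topic.
                ('imu/data_raw', 'imu/data'),
                # Output: the node publishes on imu/data; rename it to the
                # more specific topic agreed in the review.
                ('imu/data', 'imu/data_filtered_madgwick'),
            ],
        ),
    ])
```

Each remapping rule applies to the node's original topic names, so the input and output rules do not chain into each other.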
```python
self.cmdpub.publish(cmd)

def create_command(self, vel=[0.0, 0.0], ang=0.0, pitch=0.0):
    cmd = Command()
```
Not sure if you have thought about publishing cmd_vel: Twist and running it together with twist_to_command_node?
I feel that Command is very complex, whereas cmd_vel: Twist should be the common communication topic for application packages like line following and tracking; twist_to_command_node is an important converter for the rest.
Command is used as opposed to Twist to allow the pitch-tracking functionality.
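The trade-off discussed above can be illustrated with a minimal converter sketch. The Twist and Command classes below are simplified stand-ins (the real geometry_msgs/Twist and mini_pupper_interfaces Command have different field layouts), and twist_to_command is a hypothetical helper showing why pitch must be carried outside the Twist:

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for geometry_msgs/Twist and the Mini Pupper
# Command message; the real field names differ.
@dataclass
class Twist:
    linear_x: float = 0.0
    linear_y: float = 0.0
    angular_z: float = 0.0

@dataclass
class Command:
    vel: list = field(default_factory=lambda: [0.0, 0.0])
    ang: float = 0.0
    pitch: float = 0.0  # the extra degree of freedom Twist does not carry here

def twist_to_command(twist: Twist, pitch: float = 0.0) -> Command:
    """Map a Twist onto a Command, forwarding pitch as a side channel."""
    return Command(vel=[twist.linear_x, twist.linear_y],
                   ang=twist.angular_z,
                   pitch=pitch)

cmd = twist_to_command(Twist(linear_x=0.1, angular_z=0.3), pitch=0.05)
print(cmd.vel, cmd.ang, cmd.pitch)  # → [0.1, 0.0] 0.3 0.05
```

A plain Twist has no natural slot for the body-pitch setpoint, which is the reason the PR publishes Command directly rather than going through a twist_to_command_node.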
Merged commit 44d9d56 into mangdangroboticsclub:ros2-dev
Overview
This PR introduces a complete person tracking system for Mini Pupper, developed during the 2025 Global Internship Programme at HKSTP.
The robot uses YOLO11n to detect people and automatically rotates to keep them centred in its field of view. The system includes detection, UUID-based tracking, PID motion control with IMU feedback, and two visualisation methods: a Flask-based web interface and RViz rendering.
Key Additions
New Package: mini_pupper_tracking
- tracking_node.py: runs YOLO11n + motpy tracking, publishes TrackingArray
- movement_node.py: performs PID yaw/pitch control using IMU + tracking data
- camera_visualisation_node.py: renders a 3D FOV cone and detection points in RViz
- flask_server.py: hosts a live video stream with bounding boxes and UUIDs

New Message Types (in mini_pupper_interfaces)
- Tracking, TrackingArray

New Launch File
- tracking.launch.py: brings up the full tracking + control stack

RViz Visualisation Support (in mini_pupper_description)
- stanford_visualisation.launch.py: launches the robot model and visualisation nodes
- stanford_viewer.rviz: preconfigured RViz view for tracking
- stanford_joint_trajectory_to_states.py: translates joint trajectory output into joint states
- stanford_state_publisher.py: publishes joint states for visualisation

Technical Highlights
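The excerpt does not show the Tracking message definition itself; a hypothetical sketch of what such a .msg file might contain, based on the features described (UUID-based tracking of YOLO11n detections), could be:

```
# Tracking.msg — hypothetical fields for illustration only;
# the actual definition lives in mini_pupper_interfaces.
string uuid          # persistent identity assigned by motpy
float32 x            # bounding-box centre, normalised image coordinates
float32 y
float32 width
float32 height
float32 confidence   # detector score from YOLO11n
```

TrackingArray would then wrap a list of these, one entry per tracked person in the frame.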
Notes