A Robust Filter for Marker-less Multi-person Tracking in Human-Robot Interaction Scenarios

Please check out our paper and project page.

Collection of ROS nodes for 3D marker-less human pose estimation from a single RGB-D camera and for marker-less wrist following.

Installation

git clone https://github.com/penn-figueroa-lab/markerless-human-perception.git
cd markerless-human-perception
source /opt/ros/noetic/setup.bash
catkin_make
source devel/setup.bash

Dependencies

Please follow the installation instructions for the following packages:

Nodes description

Camera stream

Connect the RealSense camera to the PC and run it with the correct launch file:

roslaunch realsense2_camera data/rs_d455_rmhri.launch
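As a quick sanity check that the stream is up, a minimal sketch like the one below subscribes to the color and aligned-depth images. The topic names are the realsense2_camera defaults and may differ from those set in the launch file.

import rospy
from sensor_msgs.msg import Image

def on_color(msg):
    rospy.loginfo_throttle(5, "color: %dx%d", msg.width, msg.height)

def on_depth(msg):
    rospy.loginfo_throttle(5, "depth: %dx%d", msg.width, msg.height)

rospy.init_node("rs_stream_check")
rospy.Subscriber("/camera/color/image_raw", Image, on_color)                 # assumed default topic
rospy.Subscriber("/camera/aligned_depth_to_color/image_raw", Image, on_depth)  # assumed default topic
rospy.spin()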

Human Pose Estimation

We developed two different ROS nodes for the HPE task:

Wrist tracker

This node reads the poses from the HPE backbone and publishes the target to follow at each frame.
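As an illustration of this subscribe-and-republish pattern (not the repository's actual node), the sketch below assumes the backbone publishes the skeleton as a geometry_msgs/PoseArray and that the wrist sits at a fixed, hypothetical joint index; all topic names are assumptions.

import rospy
from geometry_msgs.msg import PoseArray, PoseStamped

WRIST_IDX = 10  # hypothetical wrist index; depends on the HPE backbone's skeleton format

def on_poses(msg):
    # Pick the wrist keypoint out of the skeleton and republish it as the target.
    if len(msg.poses) <= WRIST_IDX:
        return
    target = PoseStamped()
    target.header = msg.header
    target.pose = msg.poses[WRIST_IDX]
    pub.publish(target)

rospy.init_node("wrist_tracker_sketch")
pub = rospy.Publisher("/hpe/target", PoseStamped, queue_size=1)  # assumed topic name
rospy.Subscriber("/hpe/skeleton", PoseArray, on_poses)           # assumed topic name
rospy.spin()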

Filter

This node is the implementation of our filter. It reads the poses from the HPE backbone and, after filtering, publishes the target to follow at each frame. When the filter is enabled, the wrist tracker node is not needed.
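For intuition only, the stand-in below shows the same read-filter-publish loop with a plain exponential smoother. It is not the filter proposed in the paper, and the topic names and smoothing factor are assumptions.

import rospy
from geometry_msgs.msg import PoseStamped

ALPHA = 0.3   # assumed smoothing factor
state = None  # last filtered position

def on_target(msg):
    global state
    p = msg.pose.position
    raw = (p.x, p.y, p.z)
    # Exponential moving average of the target position.
    state = raw if state is None else tuple(ALPHA * r + (1.0 - ALPHA) * s
                                            for r, s in zip(raw, state))
    out = PoseStamped()
    out.header = msg.header
    out.pose.orientation = msg.pose.orientation
    out.pose.position.x, out.pose.position.y, out.pose.position.z = state
    pub.publish(out)

rospy.init_node("target_filter_sketch")
pub = rospy.Publisher("/hpe/target_filtered", PoseStamped, queue_size=1)  # assumed topic name
rospy.Subscriber("/hpe/target_raw", PoseStamped, on_target)               # assumed topic name
rospy.spin()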

Passive Velocity Controller

This node sends the desired end-effector velocity to the passive controller. To enable the passive controller, launch:

roslaunch franka_interactive_controllers franka_interactive_bringup.launch
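To give a feel for what "sending the desired end-effector velocity" involves, here is a sketch of a proportional velocity command toward the current target. The command and pose topic names are assumptions, not the controller's documented interface.

import rospy
import numpy as np
from geometry_msgs.msg import PoseStamped, Twist

GAIN = 1.0     # assumed proportional gain [1/s]
target = None  # latest wrist target in the robot frame

def on_target(msg):
    global target
    p = msg.pose.position
    target = np.array([p.x, p.y, p.z])

def on_ee_pose(msg):
    # Command a linear velocity proportional to the position error.
    if target is None:
        return
    p = msg.pose.position
    ee = np.array([p.x, p.y, p.z])
    cmd = Twist()
    cmd.linear.x, cmd.linear.y, cmd.linear.z = GAIN * (target - ee)
    pub.publish(cmd)

rospy.init_node("ee_velocity_sketch")
pub = rospy.Publisher("/passive_ds/desired_twist", Twist, queue_size=1)  # assumed topic name
rospy.Subscriber("/hpe/target_filtered", PoseStamped, on_target)         # assumed topic name
rospy.Subscriber("/franka/ee_pose", PoseStamped, on_ee_pose)             # assumed topic name
rospy.spin()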

Calibration

We prepared a set of scripts useful for retrieving the 4x4 RT matrices that transform a point from the Franka coordinate system or the camera coordinate system to the OptiTrack one.

The .dwg file of the calibration board can be found here.

Optitrack calibration board (ground truth)

Turn on the OptiTrack device, open the Motive app, and load the calibration file. Then, run the modified version of natnet_ros_cpp:

roslaunch natnet_ros_cpp natnet_ros.launch

Camera calibration

This node records the RGB-D stream along with the intrinsics matrix. Please note that the calibration board must be visible to both the camera and OptiTrack. Then, this tool extracts the 5 points of the calibration board from the camera recordings.
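The extraction reduces to back-projecting each detected board point from pixel coordinates and depth into a 3-D point in the camera frame using the pinhole intrinsics. A minimal sketch of that math follows, with an illustrative intrinsics matrix that is not the actual D455 calibration.

import numpy as np

def backproject(u, v, depth_m, K):
    # u, v: pixel coordinates; depth_m: depth in meters; K: 3x3 intrinsics matrix
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    return np.array([(u - cx) * depth_m / fx,
                     (v - cy) * depth_m / fy,
                     depth_m])

K = np.array([[640.0,   0.0, 640.0],   # illustrative values only
              [  0.0, 640.0, 360.0],
              [  0.0,   0.0,   1.0]])
print(backproject(700, 400, 1.2, K))   # 3-D point in the camera frame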

Franka calibration

This node collects the end-effector positions in both the robot and OptiTrack coordinate systems. These can later be used to obtain the 4x4 RT matrix with the Kabsch-Umeyama algorithm.
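A minimal NumPy sketch of the Kabsch-Umeyama step (rotation and translation only, no scaling), given N corresponding 3-D points in the source and OptiTrack frames:

import numpy as np

def kabsch_umeyama(src, dst):
    # src, dst: (N, 3) corresponding points; returns 4x4 T with dst ≈ R @ src + t
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # correct for reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T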

Citation

If you are going to use some of these nodes for scientific research, please cite the work below:

@inproceedings{Martini2024,
  title={A Robust Filter for Marker-less Multi-person Tracking in Human-Robot Interaction Scenarios},
  author={Martini, Enrico and Parekh, Harshil and Peng, Shaoting and Bombieri, Nicola and Figueroa, Nadia},
  booktitle={2024 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)},
  pages={1-6},
  year={2024},
  organization={IEEE}
}
