Solving for the robot's pose is a fundamental requirement for vision-based robot control, and it is a process that takes considerable effort and care to make accurate. Traditional approaches require modifying the robot with markers, which makes them unsuitable in most unstructured environments. Our goal is to investigate the possibilities of tracking robots in the wild, in unforeseen scenarios, to pursue autonomous robot-camera calibration, dynamic visual servoing, and even transfer learning. In this project, we utilize advanced computer vision and robot state estimation techniques to estimate the robot's pose in dynamic environments. We focus on tracking the pose of various robots, including robot manipulators, surgical robots, and snake robots, and consider how foundation models may both leverage and be leveraged by these techniques.
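A common markerless formulation of robot-camera calibration is to detect the robot's joints in the image, recover their 3D positions (e.g., from a depth camera or stereo triangulation), and align them with the joint positions given by forward kinematics. Below is a minimal sketch of that alignment step using the Kabsch/SVD rigid-transform solution; the function name, point counts, and the assumption that 3D joint correspondences are already available are illustrative, not part of the published methods.

```python
import numpy as np

def solve_rigid_transform(pts_robot, pts_cam):
    """Estimate R, t such that pts_cam ≈ R @ p + t for each robot-frame point p.

    pts_robot: (N, 3) joint positions in the robot base frame (forward kinematics).
    pts_cam:   (N, 3) the same joints observed in the camera frame.
    Uses the Kabsch algorithm: center both point sets, take the SVD of the
    cross-covariance, and compose the optimal rotation and translation.
    """
    pc = pts_robot.mean(axis=0)
    qc = pts_cam.mean(axis=0)
    # Cross-covariance of the centered correspondences
    H = (pts_robot - pc).T @ (pts_cam - qc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = qc - R @ pc
    return R, t

# Synthetic check: recover a known camera-from-robot transform.
theta = 0.3
R_gt = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                 [np.sin(theta),  np.cos(theta), 0.0],
                 [0.0,            0.0,           1.0]])
t_gt = np.array([0.1, -0.2, 0.5])
P = np.random.default_rng(0).normal(size=(8, 3))  # stand-in joint positions
Q = P @ R_gt.T + t_gt
R_est, t_est = solve_rigid_transform(P, Q)
```

In practice the correspondences are noisy keypoint detections rather than exact points, so this closed-form solve typically serves as an initialization that is refined by reprojection-error minimization or filtering over time.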
arXiv preprint arXiv:2409.19490 (2024)
Soofiyan Atar, Yuheng Zhi, Florian Richter, Michael Yip
arXiv preprint arXiv:2409.10441 (2024)
Jingpei Lu, Zekai Liang, Tristin Xie, Florian Richter, Shan Lin, Sainan Liu, Michael C Yip
IEEE International Conference on Robotics and Automation (ICRA) (2024)
Fangbo Qin, Taogang Hou, Shan Lin, Kaiyuan Wang, Michael C Yip, Shan Yu
IEEE International Conference on Robotics and Automation (ICRA) (2023)
Zih-Yun Chiu, Florian Richter, Michael C Yip
(BEST PAPER AWARD)
IEEE International Conference on Robotics and Automation (ICRA) (2023)
Jingpei Lu, Fei Liu, Cedric Girerd, Michael C Yip
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2023)
Jingpei Lu, Florian Richter, Michael C Yip
IEEE Robotics and Automation Letters (2022)
Jingpei Lu, Florian Richter, Michael C Yip
IEEE Transactions on Robotics (2021)
Florian Richter, Jingpei Lu, Ryan K Orosco, Michael C Yip