As part of my final year project in my undergraduate studies, I worked with two other students on a method to visually estimate the pose of a robot arm from a camera feed fixed away from the robot. Our project was supervised by Dr. Harsha S. Abeykoon. We used the ‘KUKA KR6 R900 Sixx’ robot arm that was available in the Robotics and Automation Lab at our university for this study.
Previous researchers in this domain had relied on additional setups, such as markers or depth cameras, to recover the missing depth information. We came up with a novel approach that reconstructs a 3D representation of the robot arm from a single 2D RGB image, using the robot arm's kinematic information and deep learning, and estimates the pose from that representation.
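To give a feel for the kinematics side of the idea, below is a minimal sketch of how the 3D positions of a 6-axis arm's joints can be computed from joint angles via forward kinematics with Denavit-Hartenberg transforms. The DH parameters and joint angles here are placeholder values for illustration only, not the actual KR6 R900 parameters, and the function names are my own; in the actual pipeline the joint angles would be inferred with the help of the deep learning model rather than given directly.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform for one link, standard Denavit-Hartenberg convention."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def joint_positions(joint_angles, dh_params):
    """Chain the per-link transforms and return the 3D position of each joint."""
    T = np.eye(4)
    points = [T[:3, 3].copy()]  # base frame origin
    for (a, alpha, d), theta in zip(dh_params, joint_angles):
        T = T @ dh_transform(a, alpha, d, theta)
        points.append(T[:3, 3].copy())
    return np.array(points)

# Placeholder (a, alpha, d) values for a generic 6-axis arm, in metres/radians.
# These are NOT the real KUKA KR6 R900 Sixx parameters.
DH_PARAMS = [
    (0.025, -np.pi / 2, 0.400),
    (0.455,        0.0, 0.000),
    (0.035, -np.pi / 2, 0.000),
    (0.000,  np.pi / 2, 0.420),
    (0.000, -np.pi / 2, 0.000),
    (0.000,        0.0, 0.080),
]

# Arbitrary joint angles for demonstration; in practice these come from the estimator.
angles = np.radians([10, -45, 30, 0, 60, 0])
print(joint_positions(angles, DH_PARAMS))
```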
The findings of the research were compiled and submitted to the 3rd International Conference in Electrical Engineering (EECon 2021), where the paper was accepted for publication. Below is a video of the presentation I gave at the conference.
https://saranganrajendran.com/wp-content/uploads/2021/12/V_1570749789.mp4
