ISSN 0021-3454 (print version)
ISSN 2500-0381 (online version)

Vol. 67, Issue 6, May 2024
Article

DOI 10.17586/0021-3454-2024-67-6-500-510

UDC 007.51

TRAINING BEHAVIOR PRIORS MODELS FOR PROGRAMMING ROBOTIC CONTACT-RICH MANIPULATION

A. Waddah
ITMO University, Faculty of Control Systems and Robotics, International Laboratory of Biomechatronics and Energy-Efficient Robotics; engineer


S. A. Kolyubin
ITMO University, Saint Petersburg, 197101, Russian Federation; Associate Professor

Reference for citation: Waddah Ali, Kolyubin S. A. Training Behavior Priors Models for Programming Robotic Contact-Rich Manipulation. Journal of Instrument Engineering. 2024. Vol. 67, N 6. P. 500–510 (in Russian). DOI: 10.17586/0021-3454-2024-67-6-500-510

Abstract. The learning-from-demonstration approach is gaining interest as a means of programming robot sensory-motor skills. At the same time, most existing works address manipulation scenarios with position-based control, whereas many application domains and operation in dynamic environments require safe and stable physical interaction, where maintaining a proper force/torque profile along the motion is crucial. This study aims to develop an experiment-planning, data-collection, and data-processing procedure for training robot behavior priors for dynamic interaction tasks. We fuse motion-capture and force/torque sensory data in a robot-out-of-the-loop setting to train a Gaussian Mixture Model / Gaussian Mixture Regression (GMM/GMR) model as a reference motion generator that takes time and a material label as inputs and outputs predicted end-effector pose, twist, and interaction wrench vectors. As a case study, we considered cutting three different materials (penoplex, cork, and PVC), resulting in 120 demonstrations in total (40 per material). Algorithms for data processing and for GMM/GMR model training and verification are introduced. We achieved RMSEs of 7.12 % and 10.69 % for twist and pose predictions, respectively, and an RMSE of 14.33 % for power estimates, a metric illustrating how accurately twist-wrench correspondences are captured by the model, which is important for interaction tasks.
Keywords: learning from demonstration, robot skill transfer, contact manipulation, interaction dynamics, motion capture
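The GMM/GMR reference-motion generator described in the abstract — fit a joint Gaussian mixture over inputs and outputs, then condition on the inputs to regress the outputs — can be sketched as follows. This is a minimal illustration only, not the authors' implementation: scikit-learn is used for fitting, the regression step is derived by hand, and the single scalar "pose" channel, the toy sine trajectory, and the component count are assumptions made for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gmr_predict(gmm, x, in_dims, out_dims):
    """Gaussian Mixture Regression: condition a fitted joint GMM on the
    input dimensions in_dims and return E[y | x] over out_dims."""
    x = np.atleast_2d(x)
    n, K = x.shape[0], gmm.n_components
    means, covs, priors = gmm.means_, gmm.covariances_, gmm.weights_
    log_w = np.zeros((n, K))                      # log responsibilities
    cond_means = np.zeros((K, n, len(out_dims)))  # per-component E[y | x]
    for k in range(K):
        mu_x, mu_y = means[k][in_dims], means[k][out_dims]
        S_xx = covs[k][np.ix_(in_dims, in_dims)]
        S_yx = covs[k][np.ix_(out_dims, in_dims)]
        diff = x - mu_x
        S_inv = np.linalg.inv(S_xx)
        # conditional mean: mu_y + S_yx S_xx^{-1} (x - mu_x)
        cond_means[k] = mu_y + diff @ S_inv @ S_yx.T
        # log prior + log N(x; mu_x, S_xx) for the mixing weights
        _, logdet = np.linalg.slogdet(S_xx)
        maha = np.einsum('ij,jk,ik->i', diff, S_inv, diff)
        log_w[:, k] = (np.log(priors[k])
                       - 0.5 * (maha + logdet + len(in_dims) * np.log(2 * np.pi)))
    w = np.exp(log_w - log_w.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # mixture of conditional means, weighted by responsibilities
    return np.einsum('nk,kno->no', w, cond_means)

# Toy demo: one scalar "pose" channel driven by time (stand-in for the
# pose/twist/wrench channels and material label used in the paper).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 400)
pos = np.sin(2 * np.pi * t) + 0.02 * rng.standard_normal(t.size)
data = np.column_stack([t, pos])

gmm = GaussianMixture(n_components=6, covariance_type='full',
                      random_state=0).fit(data)
t_query = np.linspace(0.0, 1.0, 100)[:, None]
pred = gmr_predict(gmm, t_query, in_dims=[0], out_dims=[1])  # (100, 1)
```

In the paper's setting the joint vector would additionally stack the material label and the pose, twist, and wrench channels, with time (and label) as the conditioning dimensions; the conditioning math is unchanged.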

References:
  1. Jaquier N., Welle M. C., Gams A., Yao K., Fichera B., Billard A., Ude A., Asfour T., and Kragić D. arXiv preprint arXiv:2311.18044, 2023.
  2. Jauhri S., Peters J., and Chalvatzaki G. IEEE Robotics and Automation Letters, 2022, no. 3(7), pp. 8399–8406.
  3. Liu P., Zhang K., Tateo D., Jauhri S., Hu Zh., Peters J., and Chalvatzaki G. 2023 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2023, pp. 9449–9456.
  4. Shek A., Su B. Y., Chen R., and Liu Ch. 2023 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2023, pp. 9910–9916.
  5. Zhao J., Giammarino A., Lamon E., Gandarias J., De Momi E., and Ajoudani A. IEEE Robotics and Automation Letters, 2022, no. 7, pp. 1–8.
  6. Calinon S. Intelligent Service Robotics, 2016, no. 9, pp. 1–29.
  7. Fabisch A. Journal of Open Source Software, 2021, no. 6 (62), p. 3054.
  8. Sung H. G. Gaussian mixture regression and classification, Rice University, 2004.