
A Human-Guided Vision-Based Measurement System for Multi-Station Robotic Motion Platform Based on V-Rep

Published online by Cambridge University Press:  27 September 2019

Yabin Ding*
Affiliation:
Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, Tianjin, 300354, China. E-mails: [email protected], [email protected], [email protected]
Wei Guo
Affiliation:
Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, Tianjin, 300354, China. E-mails: [email protected], [email protected], [email protected]
Xianping Liu
Affiliation:
School of Engineering, University of Warwick, Coventry, UK. E-mail: [email protected]
Zhenjun Luo
Affiliation:
Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, Tianjin, 300354, China. E-mails: [email protected], [email protected], [email protected]
*Corresponding author. E-mail: [email protected]

Summary

In the manufacturing of large, sophisticated, and individualized components, the classical solution of building ever-larger machine tools cannot meet demand. A hybrid robot, composed of a 3-degree-of-freedom (3-DOF) parallel manipulator and a 2-DOF serial manipulator, has been developed as a plug-and-play robotized module that can be rapidly relocated among multiple stations, where machining operations are performed in situ. However, achieving high absolute accuracy remains a major challenge because of the movement of the robot platform. In this paper, a human-guided vision system is proposed and integrated into the robot system to improve the accuracy of the robot's end-effector. A handheld manipulator serves as a tool for human–robot interaction in large-scale, unstructured environments that lack built-in intelligence. With its 6 DOFs, a human operator can manipulate the robot's end-effector to guide the camera toward target markers mounted on the machining datum. Simulations on the virtual control platform V-Rep show robust, real-time mapping of human manipulation to the robot's end-effector. A vision-based pose estimation method using a target marker is then proposed to determine the position and orientation of the machining datum, and a compensation method is applied to reduce pose errors along the entire machining trajectory. The algorithms are tested in V-Rep, and the results show that the absolute pose error is greatly reduced by the proposed methods and that the system is immune to motion deviations of the robot platform.
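The compensation idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the vision system has already estimated the pose of the machining datum, expresses both the nominal and measured datum poses as homogeneous transforms, and applies the resulting correction to each waypoint of a hypothetical machining trajectory. All numeric values (a 2 mm translation offset and a 1° yaw error) are made up for the example.

```python
import numpy as np

def pose_to_T(R, p):
    """Pack a rotation matrix and position vector into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

# Nominal datum pose (taken as the identity here) versus the datum pose
# measured by the vision system: an assumed 2 mm offset along x and a
# 1-degree yaw error caused by platform motion.
theta = np.deg2rad(1.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
T_nominal = pose_to_T(np.eye(3), np.zeros(3))
T_measured = pose_to_T(Rz, np.array([0.002, 0.0, 0.0]))

# Correction transform that maps targets planned in the nominal datum
# frame into the datum frame actually measured by the camera.
T_corr = T_measured @ np.linalg.inv(T_nominal)

# Apply the same correction to every pose of the machining trajectory;
# here a single hypothetical waypoint 100 mm x, 50 mm z from the datum.
waypoint = pose_to_T(np.eye(3), np.array([0.1, 0.0, 0.05]))
corrected = T_corr @ waypoint
```

Because the correction is a single rigid transform derived from the measured datum, it shifts the entire trajectory consistently, which is why the approach is insensitive to where the mobile platform actually stopped.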

Type
Articles
Copyright
© Cambridge University Press 2019

