
Misalignment estimation and compensation for robotic assembly with uncertainty

Published online by Cambridge University Press: 27 April 2005

W. S. Kim
Affiliation:
Bioptro Co. Ltd., Technopark, #D-801 151, Yatap-dong, Bundang-gu, Sungnam City, Kyungki-do 463-070 (South Korea)
H. S. Cho
Affiliation:
Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology, 373-1, Kusong-dong, Yusong-gu, Daejon (South Korea)

Abstract

The complexity and uncertainty of the cross-sectional shapes of the parts to be mated are among the main reasons that misalignment occurs between them during assembly. Misalignment can not only give rise to assembly failure but also damage the parts or the robot through large contact forces. Misalignment sensing and compensation are therefore essential for successful assembly operations. In this paper, we propose a novel misalignment estimation and compensation method that requires no prior information on the cross-sectional shapes of the mating parts. The method applies a $\varphi$-$r$ transformation and an M-estimation pattern-matching technique to misalignment images of a peg and a hole taken by an omni-directional visual sensing system during assembly. At every sampling instant of the assembly action, it furnishes information on the relative position and orientation between the mating parts, and thus helps to estimate and compensate any possible misalignment between them. A series of experiments is performed on a pair of peg-in-hole tasks, and the results are discussed. The experimental results show that the proposed method is effective for misalignment compensation in robotic assembly even when there is no prior information on part geometry and the images are very noisy.
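To make the two ingredients named in the abstract concrete, the sketch below illustrates them in their generic forms: a $\varphi$-$r$ transformation, which re-expresses a closed contour as radius versus angle about its centroid, and M-estimation pattern matching, a robust search for the angular shift that best aligns two such signatures. This is a minimal illustration of the general techniques, not the authors' implementation; the function names, the uniform angular resampling, and the Huber kernel with threshold `k` are all assumptions made for the example.

```python
import numpy as np

def phi_r_signature(contour, n_bins=360):
    """Map a closed contour (N x 2 array of x, y points) to a phi-r
    signature: radius as a function of angle about the centroid."""
    d = contour - contour.mean(axis=0)
    phi = np.arctan2(d[:, 1], d[:, 0])   # angle of each boundary point
    r = np.hypot(d[:, 0], d[:, 1])       # radius of each boundary point
    # Resample onto a uniform angular grid so two signatures are comparable.
    grid = np.linspace(-np.pi, np.pi, n_bins, endpoint=False)
    return np.interp(grid, phi, r, period=2.0 * np.pi)

def huber(e, k=1.0):
    """Huber rho-function: quadratic near zero, linear in the tails, so
    noisy contour points do not dominate the match (M-estimation)."""
    a = np.abs(e)
    return np.where(a <= k, 0.5 * e**2, k * (a - 0.5 * k))

def estimate_rotation(sig_peg, sig_hole, k=1.0):
    """Search discrete angular shifts of the peg signature against the
    hole signature; return the shift (radians) that minimizes the robust
    residual -- the estimated orientation misalignment."""
    n = len(sig_peg)
    cost = [huber(np.roll(sig_peg, s) - sig_hole, k).sum() for s in range(n)]
    return 2.0 * np.pi * np.argmin(cost) / n
```

In a complete system of this kind, the offset between the two contour centroids would supply the translational misalignment, and the estimated rotation and translation would be issued to the robot controller as a compensation command at each sampling instant.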

Type
Research Article
Copyright
© 2005 Cambridge University Press
