1. Introduction
Prestressed centrifugal concrete piles are manufactured with more modern technology than traditional concrete piles and are widely used in building construction. Increasing the productivity and quality of the piles is therefore important, and an automatic manufacturing process is needed to achieve this. In this paper, the X-shaped pile tip is considered in an automatic welding procedure. Gas metal arc welding (GMAW) is the welding method used in this research, and an industrial manipulator moves the welding gun to perform the welding process.
One major problem for an automatic welding process is determining the trajectory of the welding seam. Because the dimensions and shape of the X-shaped pile tip are not identical for all welded parts, the same position and orientation values cannot be used for every part. In manual welding, the welder can compensate for these deviations; in automatic welding, sensors are used to determine the trajectory of the welding seam.
There are many types of sensors that can be used to detect welding seam paths, each with its own advantages and disadvantages. Rout et al. reviewed the sensors used for detecting the welding seam, such as the arc sensor, vision sensor, laser vision sensor, ultrasonic sensor, electromagnetic sensor, infrared sensor, tactile sensor, and touch sensor [Reference Rout, Deepak and Biswal1]. Mahajan et al. presented a novel approach for seam tracking using an ultrasonic sensor [Reference Mahajan and Figueroa2]; their ultrasonic seam tracking system for robotic welding tracks a seam that curves freely on a two-dimensional surface. Among these sensors, three types are most commonly used for seam detection in GMAW because of their high precision: the arc sensor, the vision sensor, and the laser vision sensor.
The arc sensor is typically designed for metal arc welding and relies on a key arc property: the arc voltage and current change according to the torch height, which comprises the arc length and the wire extension [Reference Rout, Deepak and Biswal1]. Ushio et al. developed a non-linear model to describe the relationship between the output (welding current and voltage) and input (torch height) of a through-the-arc sensor (arc sensor) for DC MIG/MAG welding in open arc mode [Reference Ushio and Mao3]. Jiluan described a mathematical model of the static and dynamic properties of the sensor derived from control theory and experiments [Reference Jiluan4]. Moon et al. introduced new automatic welding equipment used in the dual tandem welding process for pipeline construction [Reference Moon, Ko and Kim5]; their arc sensor was also developed for a narrow welding groove to achieve higher seam tracking accuracy and fully automatic operation. Fridenfalk et al. presented the design and validation of a novel and universal 6D seam tracking system that reduces the need for accurate robot trajectory programming and geometrical databases in robotic arc welding [Reference Fridenfalk and Bolmsjö6]. However, the arc sensor has some disadvantages: seam tracking is performed only in real time and detects only deviations in the torch position.
A vision sensor can be used to recognize and locate welding seams, which can then be used to plan a path to weld the parts automatically [Reference Rout, Deepak and Biswal1]. Ali et al. introduced a new supervised learning technique for programming a 4-degree-of-freedom (DOF) welding arm robot with an automatic feeding electrode [Reference Ali and Atia7]. Xu et al. proposed a visual control system to locate the start welding position and track the narrow butt-welding seam in container manufacturing [Reference Xu, Fang, Chen, Yan and Tan8]. Nele et al. presented an automatic seam tracking system in which the tracking of the welding path and the torch positioning are performed by a newly developed image acquisition system [Reference Nele, Sarno and Keshari9]. Xu et al. presented a method for real-time image capturing and processing for robotic seam tracking [Reference Xu, Fang, Chen, Zou and Ye10]; by analyzing the characteristics of robotic GMAW, real-time weld images are captured clearly by a passive vision sensor. Xu et al. also investigated the application of computer vision technology for real-time seam tracking in robotic gas tungsten arc welding and GMAW [Reference Xu, Fang, Lv, Chen and Zou11]; the key aspect in using vision techniques to track welding seams is to acquire clear real-time weld images and to process them accurately. Dinham et al. introduced an autonomous robotic arc welding system that can detect realistic weld joints and calculate their position in the robot workspace with minimal human interaction [Reference Dinham and Fang12]; the proposed method detects and localizes butt and fillet weld joints regardless of base material, surface finish, or imperfections. Lin et al. proposed a hybrid CNN and adaptive ROI algorithm for intelligent seam tracking of an ultranarrow gap during K-TIG welding [Reference Lin, Shi, Wang, Li and Chen13]. Fei et al. applied machine vision methods to analyze various welding processes and assess how much machine vision technology can improve the efficiency of the welding industry [Reference Fei, Tan and Yuan14]. Liang et al. proposed a weld track identification method based on illumination correction and center point extraction to extract welds with different shapes under non-uniform illumination [Reference Liang, Wu, Hu, Bu, Liang, Feng and Ma15]. Sawano et al. introduced a robot system designed to seal the seams of car body panels using a solid-state camera [Reference Sawano, Ikeda, Utsumi, Ohtani, Kikuchi, Ito and Kiba16].
A vision sensor using only a CCD camera for seam tracking becomes very complicated because the algorithms proposed for extracting the weld seam position from the two-dimensional image plane are not very accurate and are time-consuming. Moreover, measuring the weld groove depth, weld gap, and detailed shape is quite difficult. Therefore, many researchers have investigated laser vision sensors with structured light emitters, that is, laser diodes, to solve these problems [Reference Rout, Deepak and Biswal1]. He et al. presented a method for autonomously detecting weld seam profiles against the molten pool background in metal active gas (MAG) arc welding using a novel saliency-based visual attention model and a vision sensor based on structured light [Reference He, Chen, Xu, Huang and Chen17]. Gu et al. proposed an automatic welding tracking system of an arc welding robot for multi-pass welding [Reference Gu, Xiong and Wan18]; the developed system includes an image acquisition module, an image processing module, a tracking control unit, and their software interfaces. Huang et al. introduced a laser welding experimental platform for narrow and burred seam welding (with seam width less than 0.1 mm) of complex three-dimensional (3D) surfaces based on an eight-axis machine tool [Reference Huang, Xiao, Wang and Li19]. Wu et al. presented a method to remove the noise caused by the complex welding environment using noise filters on the welding seam image captured by a CCD camera [Reference Wu, Lee, Park, Park and Kim20]. He et al. presented a scheme for extracting feature points of the weld seam profile to implement automatic multi-pass route planning and guidance of the initial welding position in each layer during MAG arc welding using a vision sensor based on structured light [Reference He, Xu, Chen, Chen and Chen21]. Ma et al. proposed an efficient and robust complex weld seam feature point extraction method based on a deep neural network (Shuffle-YOLO) for seam tracking and posture adjustment [Reference Ma, Fan, Yang, Wang, Xing, Jing and Tan22]; the Shuffle-YOLO model can accurately extract the feature points of butt joints, lap joints, and irregular joints and works well despite strong arc radiation and spatter. Ma et al. also suggested a fast and robust seam tracking method for spatial circular welding based on laser visual sensors [Reference Ma, Fan, Yang, Yang, Ji, Jing and Tan23]. Fan et al. presented a seam feature point acquisition method based on an efficient convolution operator and a particle filter, which can be applied to different weld types and achieves fast and accurate feature point acquisition even under the interference of arc light and spatter noise [Reference Fan, Deng, Ma, Zhou, Jing and Tan24]. Fan et al. also proposed an initial point alignment and seam tracking system for narrow welds [Reference Fan, Deng, Jing, Zhou, Yang, Long and Tan25]. Changyong et al. developed a laser welding seam tracking sensing technology based on swinging mirrors [Reference Changyong, Xuhao, Tie and Yi26]. Xiao et al. suggested a laser stripe feature point tracker (LSFP tracker) based on a Siamese network for robotic welding seam tracking [Reference Xiao, Xu, Xu, Hou, Zhang and Chen27]. Xu et al. proposed an automatic weld seam tracking method based on laser vision and machine learning [Reference Xu28]. Chen et al. presented an economical 3D measurement sensor based on the structured light technique [Reference Chen, Okeke and Zhang29]. Tian et al. designed and implemented a machine vision recognition algorithm system for feature points on the surface of oil pipeline welds [Reference Tian, Zhou, Yin and Zhang30]. Huissoon presented the design of a calibration system with which the sensor, laser, and robot wrist frames may be precisely defined with respect to each other [Reference Huissoon31]; this calibration can be difficult to perform because the sensor and laser frames are virtual, in the sense that they are defined in space relative to the physical hardware, and the wrist frame of the robot is often not physically accessible. Liu et al. designed a structured light stereo vision system based on a line laser sensor to collect images of weld lines in real time [Reference Liu and Wang32]. Zou et al. proposed a two-stage seam tracking method, named the Heatmap method, combining welding image inpainting and feature point locating from a heatmap [Reference Zou, Wei, Chen, Zhu and Zhou33]. Yang et al. presented a novel image-denoising method of seam images for automatic laser stripe extraction to serve intelligent robot welding applications, such as seam tracking, seam type detection, and weld bead detection [Reference Yang, Fan, Huo, Li and Liu34]. Dong et al. designed and developed a seam tracking system for human-computer interactive mobile robots [Reference Dong, Qin, Fu, Xu, Wang, Wu and Wang35]; the spatial coordinates were obtained by camera calibration, line laser calibration, and hand-eye calibration. Ma et al. developed a hybrid vision sensor that simultaneously integrates monocular vision, laser structured light vision, and coded structured light vision [Reference Ma, Fan, Zhou, Zhao, Jing and Tan36].
In almost all of the above papers, butt joints and V-groove joints are considered. In this research, the fillet welding joint is considered because of the shape of the X-shaped tip, and there is comparatively little research concentrating on the fillet joint. Chen et al. presented a method to recognize whole seams in a complex robotic welding environment [Reference Chen, Chen and Lin37]; depending on the welding parameters, the degree of image noise was judged and filter windows of different sizes were selected to preprocess the image. Dinham et al. presented a method for the automatic identification and location of welding seams for robotic welding using computer vision [Reference Dinham and Fang38]; the methods developed in that paper enable the robust identification of narrow weld seams in ferrous materials combined with reliable image matching and triangulation using 2D homography. Quan et al. proposed a visible and intelligent method and equipment to monitor the fillet welding of corrugated sheets [Reference Quan and Bi39]; the results show improvement over traditional welding methods that cannot meet the high-quality and efficient welding requirements of corrugated sheets and other large workpieces. Takubo et al. proposed a method for welding line detection using point cloud data combined with a contact sensor to automate welding operations [Reference Takubo, Miyake, Ueno and Kubo40]; the proposed system targets a fillet weld, in which the joint line between two vertically attached metal plates is welded. Dinham presented an autonomous weld joint detection and localization method using computer vision in robotic arc welding, including fillet joints [Reference Dinham41]. Zeng et al. proposed a weld joint type identification method for a visual sensor based on image features and SVM [Reference Zeng, Cao, Peng and Huang42]. Fridenfalk et al. designed and validated a sensor-guided control system for robot welding in shipbuilding [Reference Fridenfalk and Bolmsjo43]. Huang et al. proposed a Gaussian-weighted PCA-based laser center line extraction method to identify feature points in fillet joints [Reference Huang, Xu, Gao, Wei, Zhang and Li44]. Le et al. proposed a method for rectangular fillet weld tracking based on rotating arc sensors [Reference Le, Zhang and Chen45]. Cibicik et al. presented a novel procedure for robotic scanning of weld grooves in large tubular T-joints [Reference Cibicik, Njaastad, Tingelstad and Egeland46]. Fang et al. proposed vision-based modeling and control for fillet weld seam tracking [Reference Fang and Xu47].
This paper proposes a new algorithm and method to detect the welding seam position of a fillet joint using a laser distance sensor. Laser distance sensors are cheaper and simpler than laser vision sensors or vision-only systems. To the authors' knowledge, no previous research has used laser distance sensors to track the weld seam trajectory of a fillet joint. The remainder of the paper is organized as follows. Section 2 describes the proposed solution. Section 3 presents the proposed system configuration. Section 4 presents the seam tracking algorithm, and the experimental results are shown in Section 5.
2. Problem statement and proposed solution
This paper makes two contributions to the automatic welding process of the X-shaped tip of the prestressed centrifugal concrete pile. The first is the design and manufacture of a positioning table for the automatic welding process of the concrete pile tip. The second is an automatic weld seam tracking algorithm for the X-shaped tip using a laser distance sensor.
2.1. Problem statement
The X-shaped tip of the prestressed centrifugal concrete pile is shown in Fig. 1. The tip is made from four parts: one square base part, one large triangular part, and two small triangular parts. In this research, the dimensions of the square base part are 150 × 150 × 6 mm, those of the large triangular part are 212 × 66 × 6 mm, and those of the small triangular parts are 103 × 64 × 6 mm. The material of the tip is carbon steel.
To manufacture the X-shaped tip, welding paths consisting of 8 horizontal lines in the 2F position and 4 vertical lines in the 3F position are required. The manufacturing of the X-shaped tip of prestressed centrifugal concrete piles is currently semi-automatic, combining manual work with an automatic welding robot: the horizontal lines are welded automatically by an industrial robot using touch sensors, and the vertical lines are then welded manually. As a result, the quality of the product is poor and is not consistent across products.
2.2. Proposed solution
In this research, the authors propose a fully automatic process for manufacturing the X-shaped tip. To solve the problem, a positioning table supports the welding process of the industrial manipulator. The necessary welding lines are divided into four zones, I through IV, each containing two horizontal lines and one vertical line. Fig. 1 also shows the zones and the numbering of the welding lines. The position of each welding zone is reached by rotating the positioning table 90 degrees. Fig. 2 shows the idea of using the industrial robot and the positioning table to weld the X-shaped tip automatically.
The use of motion coordination between the robot and the positioning table expands the working area of the robot and gives the programmer more options when programming the welding trajectories. Welding positions far from the robot base, for example those located near the limits of the robot's workspace, are more difficult to program because fewer robot configurations satisfy the welding requirements. The positioning table moves the welding lines into a working area that is convenient for robot programming. The proposed positioning table is shown in Fig. 3.
In this research, the laser distance sensor was chosen because it has several advantages over the laser stripe sensor or laser vision sensor. Although the laser vision sensor is the most popular sensor for welding seam tracking today, it still has several weak points: the sensor system is expensive, and its control algorithm is complicated. Such a sensor is usually attached to the robot to track the welding seam of a static part. Another problem is that the laser vision method is not applicable to narrow welding seams, because the deformation of the laser stripe is not obvious at a narrow weld of 0.2 mm width. The X-shaped tip studied in this research has only narrow welding seams. The proposed solution using a laser distance sensor overcomes this challenge: the sensor is cheap and simple to use, and by adding a rotational motion it can detect the welding seam easily. Its main advantage is that it can detect narrow welding seams.
Fig. 3 shows more details of the proposed positioning table for the X-shaped tip. The idea of the proposed solution is to combine the motion of the welded part and the sensor system; this motion is separated from the robot to reduce the effect of vibration compared with attaching the sensor system to the robot. Each laser distance sensor has two motions: one translational and one rotational. The rotational motion allows the sensor to detect the welding point at one position, and the translational motion allows the sensor to detect the entire welding line. In this system, two laser distance sensors are used, one for detecting the horizontal welding line and one for detecting the vertical welding line. Combined with the rotational motion of the welded part, the proposed system can detect all the necessary welding lines of the X-shaped tip. A single laser vision sensor attached to the robot could not track all these welding lines because of the limited movement of the robot.
2.3. Mathematical fundamentals of the proposed method
The second contribution of this paper is a new weld seam tracking algorithm using a laser distance sensor. The object of this research is the fillet welding joint. A laser distance sensor combined with an angle sensor is used to determine the distance between the workpiece and the laser sensor. In one vertical plane, the laser sensor is controlled to scan over a desired angle, which is divided into many positions at which the distance from the sensor to the workpiece is measured. The main idea of the method is that the largest distance corresponds to a point on the intersection line between the vertical plane and the horizontal plane; this point also lies on the welding line. In Fig. 4, the hypotenuse OA is larger than the sides OB and OC because OAC and OAB are right triangles. With a limited rotation angle from OC to OB, OA is therefore the maximum distance from the sensor to the workpiece, and A is a point on the welding line. The laser distance sensor is therefore mounted on a translation axis: at each sensor position along this axis, the largest distance is measured and a point on the welding line is detected. Using this method, both the horizontal and vertical welding lines can be determined with laser distance sensors.
Fig. 5 shows the coordinate frame attached to the initial position of the sensor system. In this frame, O is the center of the sensor at its initial position, the OX axis is along the moving axis of the sensor, the OY axis is perpendicular to the vertical plane, and the OZ axis points downward, perpendicular to the horizontal plane.
In the proposed algorithm, L is the distance between the sensor and the object, α is the initial angle between the sensor and OY, and θ is the small angle of each rotation step of the sensor. If β is the rotation angle of the sensor and n is the number of rotation steps, the rotation angle β can be calculated using equation (1).
Assume that L_max is the maximum distance, obtained when the sensor has rotated k times; the corresponding rotation angle is (α + kθ). Let Δx be the movement of the sensor along the X-axis. The coordinates of the welding point at the i-th movement are calculated by equations (2a), (2b), and (2c).
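As an illustration, the following Python sketch implements the measurement logic described above. It assumes that equations (2a)-(2c) decompose L_max along OY and OZ using the angle (α + kθ) in the frame of Fig. 5; the callback measure_distance and the function name detect_weld_point are hypothetical placeholders, not part of the actual controller.

```python
import math

def detect_weld_point(measure_distance, i, dx, alpha, theta, n_steps):
    """Locate one weld point in the i-th scanning plane.

    measure_distance(angle) is a hypothetical callback returning the laser
    reading L at a commanded rotation angle (radians).  The sensor is swept
    from alpha in n_steps increments of theta; the largest reading L_max
    (obtained at angle alpha + k*theta) is taken as the point on the
    intersection of the two plates, i.e. on the weld line.
    """
    readings = [(measure_distance(alpha + k * theta), k) for k in range(n_steps)]
    L_max, k = max(readings)            # largest distance marks the weld point
    beta = alpha + k * theta            # rotation angle at the maximum (cf. equation (1))
    # Assumed decomposition of L_max in the sensor frame of Fig. 5:
    x = i * dx                          # translation along OX after i moves
    y = L_max * math.cos(beta)          # component toward the vertical plane (OY)
    z = L_max * math.sin(beta)          # component toward the horizontal plane (OZ, pointing down)
    return x, y, z
```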
3. The proposed system configuration
3.1. The hardware system
Fig. 6 shows the overview of the hardware configuration of the automatic welding system for the X-shaped concrete tip. The system has three main components: the welding positioning table, the sensor system and controller of the positioning table, and the welding robot.
The welding robot used in this research is the MOTOMAN UP6, a 6-DOF welding robot from Yaskawa. It has a horizontal range of 1373 mm, a vertical range of 1673 mm, a repeatability of 0.08 mm, and a load capacity of 6 kg. The controller of the robot is the MOTOMAN XRC controller.
Fig. 7 shows the welding positioning table, which has one rotational movement and two translational movements. The table uses a cam indexer to rotate in 90-degree steps. A stepper motor combined with a belt drive rotates the X-shaped tip into the working area of the sensors. Each laser sensor is translated by a ball screw; two ball screws are used, one for detecting the horizontal welding line and one for detecting the vertical welding line.
3.2. The laser distance sensor calibration and electrical diagram system
In this research, two laser distance sensors are used to detect the horizontal and vertical welding lines. The measurement is based on the time-of-flight method. Each sensor is attached to a motor that controls its rotation angle. The laser distance sensor used is the TW10S-UART, and its parameters are shown in Table I.
Because the coordinates of the welding point depend on the zero position of the sensor, it is very important to determine the zero position of the laser distance sensor. To remove errors from manufacturing and assembling the system, a zero position calibration is performed. The sensor is rotated one degree at a time, and the distance between the sensor and the object is measured; the zero position of the sensor is reached when this distance is minimal. Fig. 8 shows the calibration process of the sensor that detects the horizontal welding line.
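A minimal sketch of this calibration sweep is given below, assuming hypothetical hardware wrappers rotate_to and measure_distance; the one-degree step follows the description above, while the sweep range of 90 degrees is an illustrative assumption.

```python
def calibrate_zero_position(rotate_to, measure_distance, sweep_degrees=90):
    """Find the sensor zero position as the angle of minimum measured distance.

    rotate_to(deg) and measure_distance() are hypothetical wrappers around the
    RC motor and the TW10S-UART sensor.  The sensor is stepped one degree at a
    time; the angle at which the distance is smallest (beam roughly normal to
    the plate) is taken as the zero position.
    """
    best_angle, best_dist = 0, float("inf")
    for deg in range(sweep_degrees + 1):
        rotate_to(deg)                 # step the sensor by one degree
        d = measure_distance()         # read the time-of-flight distance
        if d < best_dist:
            best_angle, best_dist = deg, d
    return best_angle
```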
Fig. 9 shows the electrical wiring diagram of the system. Two supply voltages are used. The 24 V supply powers 3 stepper motor drives and 3 proximity sensors; one motor rotates the welded part, and the two other motors translate the laser distance sensors along the horizontal and vertical axes. The 5 V supply powers a microcontroller (MCU), 2 RC motors, and 2 laser distance sensors. The MCU receives the signals from the proximity sensors and the laser distance sensors; the signals from the laser distance sensors are transmitted to the personal computer to calculate the welding trajectory. The MCU also controls the stepper motors and the RC motors.
3.3. The graphical user interface of the robot
The data of the welding trajectory are measured from the laser distance sensors and the MCU. The position of the welding path is in the frame coordinate of the laser sensor. Thus, it is necessary to determine the transformation matrix between the sensor coordinate and the robot coordinate. After that, the MCU transmits the signal to the personal computer via RS232 protocol. Next, the data is transmitted from the personal computer to the controller of the robot via RS232 protocol too. Fig. 10 shows the graphical user interface of the welding robot on the personal computer. Using this user interface, the position data of the welding trajectory can be transmitted to the controller of the robot.
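For illustration, the sketch below applies such a sensor-to-robot transformation to the measured points, assuming a 4 × 4 homogeneous matrix T_robot_sensor has already been identified in a prior calibration step; the function and variable names are hypothetical and not the paper's implementation.

```python
import numpy as np

def sensor_to_robot(points_sensor, T_robot_sensor):
    """Map weld-path points from the sensor frame to the robot frame.

    points_sensor: (N, 3) array of XYZ points measured in the sensor frame.
    T_robot_sensor: 4x4 homogeneous transformation from the sensor frame to
    the robot frame (assumed known from a prior calibration).
    """
    pts = np.asarray(points_sensor, dtype=float)
    homogeneous = np.hstack([pts, np.ones((pts.shape[0], 1))])  # append 1 to each point
    transformed = homogeneous @ T_robot_sensor.T                # apply T to every point
    return transformed[:, :3]                                   # back to XYZ
```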
4. Automatic weld seam tracking algorithms
4.1. Proposed detecting weld seam algorithms using laser distance sensor
Fig. 11 shows the frame attached to the welding positioning table. The origin O is the center of the circular disk. OX is parallel to the axis of the ball screw that translates the laser sensor detecting the horizontal welding line. OY lies in the horizontal plane, with its positive direction pointing from the center outward. OZ is parallel to the axis of the ball screw that translates the sensor detecting the vertical welding line, pointing upward.
Fig. 12 shows the main program algorithm for detecting the automatic welding paths of the X-shaped tip. First, the horizontal and vertical ball screws are moved to their HOME positions, and the initial position of the rotational table is detected. From these, the start position of the horizontal sensor (X_0n, Y_0n, Z_0n) and the start position of the vertical sensor (X_0d, Y_0d, Z_0d) are determined. The horizontal welding line and the vertical welding line are then determined by the sensors. Because the X-shaped concrete tip has 4 parts, the positioning table must be rotated 3 times to detect all the welding zones of the tip. The position data of the tip are transmitted to the control computer before being transferred to the robot controller.
Fig. 13 shows the procedure for detecting the horizontal welding paths. The necessary welding paths consist of two horizontal lines, and the proposed algorithm detects 13 points lying on these two lines, spaced 7.5 mm apart along the X-axis. To detect one point on the desired welding path, the distance L between the sensor and the welded part is measured 24 times, corresponding to 24 rotation angles starting from the initial angle α_0n. Among the 24 values of L, the maximum value L_max corresponds to rotation number k, and the rotation angle of each step is θ. The coordinates of the detected welding point are calculated using equations (3a), (3b), and (3c).
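A hedged sketch of this scanning procedure is given below, reusing the detect_weld_point sketch from Section 2.3; move_to_x and measure_distance are hypothetical hardware wrappers, and the numerical values (13 points, 7.5 mm spacing, 24 angles) follow the description above.

```python
def scan_horizontal_path(measure_distance, move_to_x, alpha_0n, theta,
                         n_points=13, dx=7.5, n_angles=24):
    """Scan one pair of horizontal weld lines point by point.

    At each of the n_points stations (7.5 mm apart along OX) the sensor is
    swept through n_angles steps of theta starting from alpha_0n, and the
    weld point is taken at the maximum measured distance, as in the
    detect_weld_point sketch of Section 2.3.
    """
    path = []
    for i in range(n_points):
        move_to_x(i * dx)                                     # translate the sensor along OX
        point = detect_weld_point(measure_distance, i, dx,
                                  alpha_0n, theta, n_angles)  # one weld point per plane
        path.append(point)
    return path
```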
Fig. 14 shows the procedure for detecting the vertical welding paths, which is similar to the procedure for the horizontal welding paths.
4.2. The improved algorithm
The accuracy of the above algorithm depends on the rotation angle of each step: the smaller the rotation angle, the more accurate the welding coordinates. However, the minimum rotation angle is limited by the resolution of the motor. In a plane perpendicular to the translational axis of the laser sensor, there exists a step i such that point i and point (i + 1) lie on opposite sides of the welding line, as shown in Fig. 15. In that case L_i is the maximum measured distance from the sensor to the welded part, yet there is an error between the detected point and the true welding point. Thus, a large rotation angle at each step produces a large error in the welding point coordinates.
This section presents an improved algorithm that reduces the error of the detected welding point. In this algorithm, the coordinates of every measured point are collected; in total there are n measured points from O1 to On. These points are divided into two groups: the first group contains the points lying on the vertical plane, numbered O1 to Oi, and the second group contains the points lying on the horizontal plane, numbered Oi+1 to On. A line is fitted to each group, and the intersection point of these two fitted lines is the desired welding point. Fig. 16 shows the real welding point (red), the detected welding point (black), and the improved welding point (blue). The welding point obtained with the improved algorithm (blue) has a smaller error than the detected welding point (black).
In the improved algorithm, the X coordinate is fixed within one scanning plane of the laser, so the Y and Z coordinates can be treated as a 2D problem. The least squares method is used to fit the equation of the line in the vertical plane and the line in the horizontal plane, both of which lie in the same scanning plane of the laser sensor. The equations of these lines are given in equations (4a) and (4b); because only the Y and Z coordinates are considered, a 2D line describes the relationship between them.
The intersection point of the two fitted lines is calculated using equations (5a) and (5b), and the position of the welding point is obtained from Y_i and Z_i.
Finally, the coordinates of the welding point obtained with the improved algorithm are the fixed X coordinate of the scanning plane together with the intersection coordinates Y_i and Z_i.
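The following sketch illustrates one possible implementation of the improved algorithm: a least squares line is fitted to each group of points in the Y-Z scanning plane, and the intersection of the two lines is returned. The split index and the particular line parameterizations (y = c·z + d for the vertical-plane group and z = a·y + b for the horizontal-plane group) are assumptions for illustration; equations (4) and (5) in the paper may use a different form.

```python
import numpy as np

def improved_weld_point(points_yz, split):
    """Refine the weld point by intersecting two least squares lines.

    points_yz: (n, 2) array of (Y, Z) measurements in one scanning plane
    (X is fixed).  The first `split` points are assumed to lie on the
    vertical plate, the rest on the horizontal plate.  The vertical-plate
    trace is fitted as y = c*z + d and the horizontal-plate trace as
    z = a*y + b (an assumed parameterization that keeps both fits
    well-conditioned); their intersection is the refined weld point.
    """
    pts = np.asarray(points_yz, dtype=float)
    c1, d1 = np.polyfit(pts[:split, 1], pts[:split, 0], 1)  # y = c1*z + d1 (vertical plate)
    a2, b2 = np.polyfit(pts[split:, 0], pts[split:, 1], 1)  # z = a2*y + b2 (horizontal plate)
    z_i = (a2 * d1 + b2) / (1.0 - a2 * c1)                  # solve the 2x2 system for Z
    y_i = c1 * z_i + d1                                     # back-substitute for Y
    return y_i, z_i
```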
5. Experimental results
This section presents the experimental results that verify the proposed algorithm of Section 4. The hardware system used for the experiment is shown in Fig. 6. Figs. 17, 18, and 19 show the experimental process of detecting the welded part of the X-shaped concrete pile; in these figures, only one-fourth of the tip is detected. Figs. 17 and 18 show the detection of the horizontal welding path, and Fig. 19 shows the detection of the vertical welding path.
Next, four types of welding path are compared: the real welding path, the detected welding path, the welding path after filtering, and the welding path obtained with the improved algorithm. Figs. 20, 21, 22, and 23 show the four welding paths for the four zones, respectively. The real welding path, determined by the robot with high precision, is shown in red. The detected welding path, determined by processing the data from the sensor system, is shown in green. To remove sensor noise, a mean filter is applied, and the resulting path is shown in black. Finally, the welding paths obtained with the improved algorithm are shown in blue.
The detected welding path obtained with the laser distance sensors can track the real welding paths of the X-shaped concrete tip. However, the error of the detected path is about 6 mm, which is too large for GMAW applications; the error comes from noise in the measurement signal. The error of the vertical welding path is larger than that of the horizontal path, partly because the laser sensor detecting the vertical welding line is farther from the part than the sensor detecting the horizontal welding line. In the mean filter method, a mean filter is added to the controller to remove the noise, reducing the error to about 4 mm. The final method uses the improved algorithm described above to find the intersection point of the two fitted lines, which reduces the error to about 2 mm. This error is acceptable for GMAW applications.
Several previous studies using laser vision sensors report seam tracking errors of about 0.5 mm [Reference Quan and Bi39, Reference Cibicik, Njaastad, Tingelstad and Egeland46, Reference Fang and Xu47]. Other studies using vision sensors report errors of about 1 to 2.5 mm [Reference Takubo, Miyake, Ueno and Kubo40, Reference Dinham41]. A rotating arc sensor has also been used for seam tracking with an error of about 1 mm [Reference Le, Zhang and Chen45]. Thus, methods using laser vision sensors or vision sensors can achieve smaller errors than the proposed method; however, the proposed system is cheaper than these advanced weld seam tracking methods. One source of error in the proposed method is the mechanical system: the motor that rotates the laser distance sensor is an RC motor whose resolution is limited and which can give large errors when rotating through small angles. In future work, a stepper motor will be used to reduce this type of error and the overall seam tracking error of the system.
Figs. 24, 25, and 26 show the welding path tracked by the robot in one zone after receiving the welding path from the improved algorithm. The experiment confirms that the welding gun can track the welding paths.
6. Conclusion
In this paper, an algorithm for welding seam tracking using a laser distance sensor is proposed, and the mathematical fundamentals of the algorithm are presented. A positioning table system supporting the procedure is designed and manufactured. Two algorithms for detecting the welded part are proposed: the basic method using the laser distance sensor and the improved method. The experimental results show that the accuracy of the proposed method is acceptable. In the future, welding experiments will be performed to further verify the proposed method.
Acknowledgements
We acknowledge the support of time and facilities from Ho Chi Minh City University of Technology (HCMUT), VNU-HCM for this study.
Author contributions
Cao Tri Huynh and Tri Cong Phung conceived and designed the study. Cao Tri Huynh conducted data gathering. Tri Cong Phung performed statistical analyses. Tri Cong Phung wrote the article.
Financial support
This research is funded by Vietnam National University Ho Chi Minh City (VNU-HCM) under grant number B2021-20-03.
Competing interests
The authors declare that no competing interests exist.
Ethical approval
None.