Planning for Optimal Robot Trajectory for Visual Inspection
Ahmad Yusairi Bani Hashim1, Nur Sufiah Akmala Ramdan1, Radin Puteri Hamizah Radin Monawir1, Mohd Nazrin Muhammad1
1Department of Robotics and Automation, Faculty of Manufacturing Engineering, Universiti Teknikal Malaysia Melaka, Durian Tunggal, Melaka, Malaysia
Abstract
Automatic inspection is common in mass production, where robot manipulators are chosen to perform visual inspection to avoid the inconsistency of manual inspection. The purpose of this work is to estimate the optimal workspace in which a robot manipulator can perform a visual inspection task on a work piece, with a camera attached to the end effector. While maneuvering through the programmed path, the robot stops at predefined points so that images can be captured, from which the ideal parameters for coefficient-correlation (CC) template matching were computed. The template distance, the threshold adjustment and the average image-processing speed for CC were determined. The pseudo codes for the planned path are derived from the number of tool transit points, the delay time at the transit points, the process cycle time, and the configuration space that sets the distance between the tool and the work piece. It was observed that an express start and a swift end are acceptable in a robot program because useful work does not usually take place during these moments. During the mid-range of the cycle, however, there are always practical tasks programmed to be executed. For that reason, the robot should be programmed so that rapid changes in actuator displacement are avoided. A dynamic visual inspection system using a robot manipulator seems practical when an optimal distance between an object face and a camera is determined a priori using the CC method.
Keywords: robot manipulator, visual inspection, coefficient correlation
Journal of Automation and Control, 2014 2 (1),
pp 15-20.
DOI: 10.12691/automation-2-1-3
Received October 26, 2013; Revised December 30, 2013; Accepted January 24, 2014
Copyright © 2013 Science and Education Publishing. All Rights Reserved.
1. Introduction
Manual inspection is still the method of choice for final inspection in quality control. However, inconsistency in human inspection is real because humans tire over time and cannot stay focused on repetitive work. It is also difficult and costly to hire and train human experts. It is fair to claim that humans cannot match the consistency of automatic inspection, such as a vision system. In fact, there are cases where inspection is tedious or difficult even for the best-trained expert [1]. It has been suggested that an optical system composed of an LCD projector, a digital camera and a microcomputer may be used to measure a three-dimensional model [2].
Recent trends show that automatic inspection is more appealing for mass-production inspection. In such a case, a robot manipulator seems the best candidate for a dynamic visual inspection. A robot manipulator is a machine formed by a mechanism with a number of degrees of freedom, often having the appearance of one or more arms ending in a wrist capable of holding a tool, a work piece or an inspection device [3]. A manipulator is a mechanical unit that provides motions or trajectories similar to those of a human arm and hand; a robot manipulator provides those trajectories through programmed instructions.
On a manipulator, the end of the wrist can reach a fixed point having a particular set of coordinates and a specific orientation. This is realized through an end effector: the end of the robot's wrist is equipped with end-of-arm tooling. An end effector may be equipped with tooling such as [4]: i) a gripper, a hook, a scoop, an electromagnet, a vacuum cup, or an adhesive finger for material handling; ii) a spray gun for painting; iii) an attachment for spot and arc welding and cutting; iv) a power tool [5]; v) a measuring instrument.
Robot manipulators can be classified into two basic structural types: parallel and serial [6]. A serial structure is constructed so as to form the shape of a shoulder, elbow and wrist [7]. One major advantage of the serial type over the parallel type is its larger workspace.
The purpose of this work is to estimate the optimal workspace in which a robot manipulator performs a visual inspection process on a work piece, with a camera attached to the end effector. While planning the trajectory, the joints' positions, velocities, and accelerations are chosen so that obstacles are avoided, the tool travels along the shortest path, and the cycle time stays within the desired bound.
A vision system fixed at one location has a limited image-capturing area. The idea is therefore to attach a camera to the end effector: with the robot in motion, the camera covers a wider capturing area and at an ideal angle with respect to a complex-shaped work piece.
2. Methodology
This work carries out the following sequence of activities: identification of the method for path and trajectory planning; design of the program structure; development of the robot program; and assessment of the robot's paths and trajectories.
2.1. Path and Trajectory Planning
A robot manipulator performs pre-planned tasks by controlling the rate of movement of its actuators. This process is known as trajectory planning. The rates and sequences of actuator movement are programmed so that a desired path is obtained. A path is a continuous map $\gamma : [0, 1] \to Q$ in a configuration space $Q$ that begins at a start point $\gamma(0) = q_0$ and ends at a final point $\gamma(1) = q_f$. A trajectory is a function of time $q(t)$ such that $q(t_0) = q_0$ and $q(t_f) = q_f$; the difference $t_f - t_0$ is the time taken to execute the trajectory [7]. The trajectory describes the rate of actuator rotation at each manipulator joint; hence there are six such functions defining the trajectories of the six actuators.
Consider a cubic trajectory for a single joint. For a scalar joint variable $q(t)$, the boundary values at times $t_0$ and $t_f$ satisfy the constraints in (1). The cubic trajectory and its derivative are defined in (2), whereas (3) is the algebraic (matrix) representation of the trajectory. The constraints and definitions stated in (1), (2), and (3) are used to plan the robot's tool path, where the actuators' angles of rotation, velocities and accelerations are realized through the robot's program.
$$q(t_0) = q_0, \quad q(t_f) = q_f, \quad \dot q(t_0) = v_0, \quad \dot q(t_f) = v_f \tag{1}$$

$$q(t) = a_0 + a_1 t + a_2 t^2 + a_3 t^3, \quad \dot q(t) = a_1 + 2 a_2 t + 3 a_3 t^2 \tag{2}$$

$$\begin{bmatrix} 1 & t_0 & t_0^2 & t_0^3 \\ 0 & 1 & 2 t_0 & 3 t_0^2 \\ 1 & t_f & t_f^2 & t_f^3 \\ 0 & 1 & 2 t_f & 3 t_f^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix} q_0 \\ v_0 \\ q_f \\ v_f \end{bmatrix} \tag{3}$$
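The cubic-trajectory planning above can be sketched in a few lines of code: solve the linear system (3) for the coefficients, then evaluate (2). This is a generic illustration of the standard cubic-polynomial method, not the authors' actual robot program; the function and variable names are ours.

```python
import numpy as np

def cubic_coeffs(q0, qf, v0, vf, t0, tf):
    """Solve the 4x4 linear system (3) for the cubic coefficients a0..a3."""
    M = np.array([
        [1.0, t0, t0**2,   t0**3],
        [0.0, 1.0, 2*t0, 3*t0**2],
        [1.0, tf, tf**2,   tf**3],
        [0.0, 1.0, 2*tf, 3*tf**2],
    ])
    b = np.array([q0, v0, qf, vf])
    return np.linalg.solve(M, b)

def cubic_eval(a, t):
    """Evaluate the joint position q(t) and velocity dq(t) from (2)."""
    q  = a[0] + a[1]*t + a[2]*t**2 + a[3]*t**3
    qd = a[1] + 2*a[2]*t + 3*a[3]*t**2
    return q, qd

# Example: a joint moves from 0 deg to 90 deg in 2 s, starting and ending at rest.
a = cubic_coeffs(0.0, 90.0, 0.0, 0.0, 0.0, 2.0)
q_end, qd_end = cubic_eval(a, 2.0)   # hits q = 90 deg with zero velocity
```

With rest-to-rest boundary conditions the velocity peaks at the midpoint of the motion, which is why the plotted joint velocities in Figures 6 and 7 are bell-shaped.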
The pseudo codes for a planned path are derived from the number of tool transit points, the delay time at the transit points, the process cycle time, and the configuration space that the robot makes. Table 1 shows the transit points (vertex positions), the assigned coordinates, and the path that passes through the transit points. There are eight vertices within the path that the tool has to travel along.
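The structure of such pseudo code can be sketched as follows. The eight vertex coordinates below are placeholders, not the values of Table 1, and `move_to`/`capture` stand in for the robot-controller and camera-system calls, which are outside the scope of this sketch.

```python
import time

# Hypothetical transit points (vertices) in tool-space coordinates (cm);
# the actual coordinates come from Table 1 and are not reproduced here.
TRANSIT_POINTS = [
    ("P1", (0, 0, 20)),  ("P2", (10, 0, 20)),  ("P3", (20, 0, 20)),
    ("P4", (20, 10, 20)), ("P5", (20, 20, 20)), ("P6", (10, 20, 20)),
    ("P7", (0, 20, 20)),  ("P8", (0, 10, 20)),
]

def run_cycle(move_to, capture, dwell_s=1.0):
    """Visit every transit point in order, dwelling at each one so the
    separate camera system can capture an image."""
    for name, xyz in TRANSIT_POINTS:
        move_to(xyz)          # placeholder for the robot-controller move
        time.sleep(dwell_s)   # delay time at the transit point
        capture(name)         # placeholder trigger for the camera system
```

The cycle time is then roughly the sum of the move times plus eight dwell periods, which is the quantity the configuration-space design tries to minimize.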
The robot manipulator was used to realize the automatic visual inspection; its function was to bring the camera to the pre-defined locations within the configuration space. The camera shots were taken by a separate system. Figure 1 exhibits the experimental setup, consisting of the robot itself, a work piece (a car door), and a camera attached to the end effector. The white markings were plastic straws that served as indicators delimiting the configuration space: they marked the limit the camera should reach while avoiding any obstacles. Prior to the realization of the inspection process, the setup was simulated in Workspace. Figure 2 shows the simulated environment of the setup. The results obtained from the simulation explain the behavior of the individual actuators, reflecting equations (1), (2), and (3).
While maneuvering through the programmed path, the robot stops at the predefined points so that images can be captured. Figure 3 shows examples of such points, with image areas selected as shown in Figure 3d and Figure 3e. This sub-process is conducted to find the optimum parameters for coefficient-correlation (CC) template matching. The template distance, the threshold adjustment and the average image-processing speed for CC are determined. The sample images are tested while varying the threshold value and the distance L between the camera and the object; the sample images were captured within the tested range of distances.
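The CC matching step can be illustrated with a minimal sketch: slide a template over the image and keep every position whose zero-mean normalized correlation score exceeds the threshold. This is a plain NumPy illustration of the correlation measure, assumed to be equivalent in spirit to the CC method used; the function names are ours, and a production system would use an optimized routine.

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalized cross-correlation of two equal-size gray patches.
    Returns a score in [-1, 1]; 1 means a perfect match."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_template(image, template, threshold=0.8):
    """Slide the template over the image and return (row, col, score) for
    every position whose CC score meets the threshold."""
    H, W = image.shape
    h, w = template.shape
    hits = []
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            score = ncc(image[y:y+h, x:x+w], template)
            if score >= threshold:
                hits.append((y, x, score))
    return hits
```

The threshold of 0.8 mirrors the value later reported for the detected regions in Figure 12; tuning it trades false positives against missed defect points.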
3. Results and Discussion
The realization of the experimental setup, based on the simulation environment, is shown in Figure 2. The image frames are four of the robot-motion images selected from the inspection process. The frames show the transit points at which the camera captured images of areas likely to have defects. There were five points where the tool needed to stop for a moment so that the camera could capture images of the defect areas. The images were then uploaded for image processing to determine the degree of the defects.
Prior to robot initiation, the results obtained from Workspace (see Figure 4) explain the joints' behaviors. The behaviors dictated that, in order to complete a cycle of visual inspection, the joints would need to move through some angular displacement. It is observed in Figure 4 that all joints were active during the inspection process from start to end. Had this been implemented in the actual robot program, there would have been unnecessary energy usage. From Figure 4, joint 4 is seen to consume the most energy, whereas joint 6 consumed the least. The simulated cycle time was less than 80 seconds.
Based on the simulation results, it was observed that not all joints needed to be controlled to achieve the optimum configuration space following the specifications listed in Table 1. Figure 5 shows the actual joints' behavior for a cycle time of 25.02 seconds. This is less than the simulated cycle time because of the improved configuration space. In Figure 5, it is observed that the joint displacements were moderated, each joint displacing only as required. All joints functioned while the tool was in motion.
In simulation there were rapid speed changes, with all joints active throughout the cycle (see Figure 6), whereas in the actual setup most of the joints were fairly steady during the mid-range of the cycle (see Figure 7). The mid-range of the cycle was where the inspection was carried out. In simulation, the robot had a rapid start and a fast end; similarly, in the actual implementation, the trajectories had a fast start and a rapid end. This is further explained in Figure 8 and Figure 9, where the robot showed slight decelerations in the simulation environment but considerably more rapid decelerations in the actual setup.
In general, a template of an image at distance L = X is best matched against an image taken at that same distance. A template of an image at L = Y, however, could still match an image at L = X [8]. Therefore, this work implemented template matching such that the first parameter of the algorithm, the template distance, was found first. Templates for images over the tested range of distances were evaluated. The template at L = 35 cm, however, was not taken as a sample template because the distance was too far; an image captured through that template could lose image features and hence give poor accuracy.
The results are listed in Table 2. For L = 15 cm, the image retained most of its features. The value 3/3 indicates that 3 out of 3 points were detected. The template cropped at L = 15 cm was found acceptable for short ranges of 15 cm and 20 cm, respectively. Its degree of detection, however, was the lowest, at 26.67%. Conversely, there were two cases where the degree of detection was the highest: L = 20 cm and L = 30 cm. These templates could detect more points across different distances. Nevertheless, the template with L = 20 cm was selected to reduce the risk of losing image features. Figure 11 depicts the templates that match the points based on the CC method; the points are shown within the boxes in Figure 11b. In addition, Figure 12 explains, through the correlation measurement space, how the points were detected on a chromatic scale such that a value of 0.8 represented a detected region.
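The template-selection rule described above reduces to a simple computation: score each candidate template distance by its degree of detection and, on a tie, prefer the shorter distance to limit feature loss. The detected/expected counts below are illustrative placeholders chosen to be consistent with the percentages quoted, not the actual entries of Table 2.

```python
def detection_degree(detected, expected):
    """Degree of detection as a percentage of expected defect points found."""
    return 100.0 * detected / expected

# Illustrative counts only (not the values from Table 2): for each candidate
# template distance L (cm), (points detected, points expected) over the tests.
counts = {15: (4, 15), 20: (12, 15), 30: (12, 15)}

scores = {L: detection_degree(d, n) for L, (d, n) in counts.items()}
# On equal scores, prefer the shorter template distance (less feature loss).
best_L = max(sorted(scores), key=scores.get)
```

With these placeholder counts the rule picks L = 20 cm over the tied L = 30 cm, mirroring the selection made in the text.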
4. Conclusion
A trajectory of an individual joint may be observed from the plot of its curve. From the curves, the planner may program the robot so that the joints are instructed to rotate by actuators drawing some amount of electric current from the driver circuits. This work provides a solution to automatic visual inspection in which a camera is attached to the end effector of a manipulator. The camera captures images at pre-defined stopovers and at a preset distance between the camera and the object's face. A special configuration space is designed so that the cycle time is optimal, with the camera at the right pose at every transit point. An express start and a swift end are acceptable in a robot program because useful work does not usually take place during these moments. During the mid-range of the cycle, however, there are always useful tasks to be executed. For that reason, the robot should be programmed so that rapid changes in actuator displacement are avoided. In short, a dynamic visual inspection system using a robot manipulator seems practical when an optimal distance between an object face and a camera is determined a priori using the CC method.
References
[1] E. N. Malamas, E. G. Petrakis, M. Zervakis, L. Petit, and J.-D. Legat, "A survey on industrial vision systems, applications and tools," Image and Vision Computing, vol. 21, no. 2, pp. 171-188, Feb. 2003.
[2] S. Del Vecchio, P. A. de Araújo, J. C. C. Rubio, M. Pinotti, and M. Sesselmann, "3D measurement of human plantar foot by projection moiré technique," Int. J. Mechatronics and Manufacturing Systems, vol. 5, no. 1, pp. 3-16, 2012.
[3] S. Kalpakjian and S. Schmid, Manufacturing Engineering & Technology, 7th ed. Prentice Hall, 2013.
[4] F. L. Lewis, D. M. Dawson, and C. T. Abdallah, Robot Manipulator Control: Theory and Practice. CRC Press, 2002.
[5] Y. Li, W. Wang, H. Li, and Y. Ding, "Feedback method from inspection to process plan based on feature mapping for aircraft structural parts," Robotics and Computer-Integrated Manufacturing, vol. 28, no. 3, pp. 294-302, Jun. 2012.
[6] C. D. Crane III and J. Duffy, Kinematic Analysis of Robot Manipulators, Jan. 1998.
[7] M. W. Spong, S. Hutchinson, and M. Vidyasagar, Robot Modeling and Control. Wiley, 2005.
[8] M. Nixon and A. S. Aguado, Feature Extraction & Image Processing. Academic Press, 2008.