Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator

This work deals with the possibility of integrating a 3D sensor into a robotic manipulator, with the objective of automatically detecting, tracking and grasping an object and placing it in another location. To enhance the flexibility and ease of use of the robot, MATLAB, a versatile and powerful programming language, is used to control it. The work implements pick and place, a common industrial task in many factories. The robotic system consists of an ABB IRB120 robot equipped with a gripper and a Kinect for Windows 3D camera sensor. Three-dimensional data acquisition, image processing and several camera parameters are investigated. The information in the image acquired from the camera is used to determine the robot’s workspace and to recognize workpieces, and from it the positions of the objects are calculated. Using this information, an automatic grasping path was designed and developed to compute a feasible trajectory to an object in real time. To detect the workpieces, object recognition techniques are applied using algorithms available in MATLAB’s Computer Vision Toolbox and Image Acquisition Toolbox. These provide the position of the object of interest and its orientation. This information is then sent to the robot, which creates a path, through a server-to-client connection over a computer network in real time.

Keywords: Kinect, object recognition, 3D vision system, MATLAB, RobotStudio
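The pipeline summarized above (Kinect acquisition, workpiece recognition, 3D position computation, and transmission to the robot over a network socket) can be sketched in MATLAB roughly as follows. This is a minimal illustration, not the authors' code: the threshold-based segmentation, the IP address, and the port number are placeholder assumptions, and it presumes the Kinect adaptor of the Image Acquisition Toolbox and a TCP server listening on the robot controller.

```matlab
% Sketch of the pick-and-place vision pipeline (hedged example).
% Assumes the Image Acquisition Toolbox Kinect adaptor; device indices
% 1 (color) and 2 (depth) follow that adaptor's convention.
colorVid = videoinput('kinect', 1);   % RGB stream
depthVid = videoinput('kinect', 2);   % depth stream

rgb   = getsnapshot(colorVid);
depth = getsnapshot(depthVid);

% Segment a workpiece; a real setup would use a calibrated workspace
% region and trained recognition instead of a plain global threshold.
bw    = imbinarize(rgb2gray(rgb));
stats = regionprops(bw, 'Area', 'Centroid', 'Orientation');
[~, k] = max([stats.Area]);                  % largest blob as the workpiece
c = stats(k).Centroid;                       % pixel coordinates (x, y)
z = double(depth(round(c(2)), round(c(1)))); % depth at the centroid, mm

% Send position and orientation to the robot controller over TCP.
% Host and port are placeholders for the controller's service socket.
t = tcpclient('192.168.125.1', 1025);
write(t, uint8(sprintf('%.1f %.1f %.1f %.1f\n', ...
    c(1), c(2), z, stats(k).Orientation)));
```

On the controller side, a RAPID task in RobotStudio would accept the connection, parse the pose, and execute the grasp trajectory, which matches the server-to-client arrangement described in the abstract.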

Sakarya Üniversitesi Fen Bilimleri Enstitüsü Dergisi
  • ISSN: 1301-4048
  • Publication frequency: 6 issues per year
  • Founded: 1997
  • Publisher: Sakarya Üniversitesi Fen Bilimleri Enstitüsü