Visual Feedback Control and Transfer Learning-Based CNN for a Pick and Place Robot on a Sliding Rail

Fourth Author's Department

Mechanical Engineering Department


https://doi.org/10.1109/ICMA52036.2021.9512777

Document Type

Research Article

Publication Title

2021 IEEE International Conference on Mechatronics and Automation, ICMA 2021

Publication Date

8-8-2021

doi

10.1109/ICMA52036.2021.9512777

Abstract

Among the various types of deep neural networks (DNNs), convolutional neural networks (CNNs) have ingenious structures and are widely used for image recognition and defect inspection. The authors previously developed a design, training, and test tool for CNNs and support vector machines (SVMs) to support defect detection in various kinds of manufactured products, demonstrating its effectiveness and user-friendliness through classification experiments on images of actual products. The tool also enables users to view the most activated area in each classified image. In addition to the tool, a desktop-sized pick-and-place (PP) robot was proposed, implementing a pixel-based visual feedback (VF) controller that autonomously reaches target objects, together with a CNN designed on the transfer learning concept to estimate objects' orientations. In this paper, a sliding rail is introduced to allow the articulated robot to move within a wider working range, and the VF controller is extended to utilize the rail. The usefulness and user-friendliness of the robot system with the sliding rail are confirmed through PP experiments on objects placed randomly on a table.
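The abstract describes a pixel-based VF controller extended to use the sliding rail. The paper's actual control law is not given here, so the following is only a minimal illustrative sketch: a proportional step on the pixel error, where coarse horizontal motion beyond an assumed threshold is delegated to the rail. All function names, gains, and the threshold are hypothetical.

```python
# Minimal sketch of a pixel-based visual feedback (VF) step extended with a
# sliding rail, in the spirit of the abstract. The gain, threshold, and the
# rail/arm split are illustrative assumptions, not the authors' controller.
def vf_step(target_px, effector_px, gain=0.05, rail_threshold=200.0):
    # Pixel error between the detected target and the end-effector marker.
    ex = target_px[0] - effector_px[0]
    ey = target_px[1] - effector_px[1]
    # If the horizontal error exceeds the arm's comfortable reach in the
    # image, delegate the coarse motion to the sliding rail; otherwise
    # command the articulated arm with a simple proportional law.
    if abs(ex) > rail_threshold:
        return {"rail": gain * ex, "arm": (0.0, gain * ey)}
    return {"rail": 0.0, "arm": (gain * ex, gain * ey)}

# A far-away target engages the rail; a nearby one is handled by the arm.
far_cmd = vf_step((400.0, 100.0), (100.0, 80.0))
near_cmd = vf_step((150.0, 100.0), (100.0, 80.0))
```

In a real system this step would run in a loop, re-detecting the target each frame until the pixel error falls below a stopping tolerance.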

First Page

697

Last Page

702
