Robotic systems require sensing to enable flexible operation in uncalibrated or partially calibrated environments. Recent work combining robotics with vision has emphasized an active vision paradigm, in which the system changes the pose of the camera to improve environmental knowledge or to establish and preserve a desired relationship between the robot and objects in the environment. Much of this work has concentrated on the active observation of objects by the robotic agent. We address the problem of robotic visual grasping (eye-in-hand configuration) of static and moving rigid targets. The objective is to move the image projections of certain feature points of the target so as to effect a vision-guided reach and grasp. An adaptive control algorithm for repositioning the camera compensates for the servoing errors and the computational delays introduced by the vision algorithms. Stability issues are discussed, along with the minimum number of feature points required. Experimental results are presented to verify the validity and efficacy of the proposed control algorithms. We then address an adaptation of the control paradigm that focuses on the autonomous grasping of a static or moving object in the manipulator's workspace. Our work extends the capabilities of an eye-in-hand system beyond those of a 'pointer' or a 'camera orienter' to provide the flexibility required to interact robustly with the environment in the presence of uncertainty. The proposed work is experimentally verified using the Minnesota Robotic Visual Tracker (MRVT) to automatically select object features, to derive estimates of unknown environmental parameters, and to supply a control vector based upon these estimates to guide the manipulator in grasping a static or moving object.
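The paper's adaptive law is not reproduced here, but the underlying idea of moving feature-point projections toward desired image locations can be sketched with the classical image-based visual servoing (IBVS) control law, v = -λ L⁺(s - s*), where L is the interaction matrix (image Jacobian). All function names, the gain, and the unit-depth assumption below are illustrative, not taken from the paper:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix (image Jacobian) of one normalized image
    point (x, y) at depth Z: maps camera velocity
    [vx, vy, vz, wx, wy, wz] to image-plane feature velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Classical IBVS law v = -gain * L^+ (s - s*).
    features, desired: (N, 2) current and goal image points;
    depths: length-N depth estimates (in practice only estimates
    are available, which is one motivation for adaptive control)."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (features - desired).ravel()
    return -gain * (np.linalg.pinv(L) @ error)

# Four feature points (e.g., corners of a grasp target), unit depth.
goal = np.array([[0.1, 0.1], [-0.1, 0.1], [-0.1, -0.1], [0.1, -0.1]])
current = goal + np.array([-0.02, 0.0])          # target shifted in the image
v = ibvs_velocity(current, goal, np.ones(4))     # commanded camera twist
```

Three or more non-collinear points make the stacked L tall enough (2N x 6) for the pseudo-inverse to determine all six velocity components, which relates to the feature-count discussion in the paper; the delay compensation and parameter adaptation of the actual controller are beyond this sketch.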
- Original language: English (US)
- Number of pages: 34
- Journal: Journal of Intelligent and Robotic Systems: Theory and Applications
- State: Published - 1997
Bibliographical note (Funding Information):
This work has been supported by the Department of Energy (Sandia National Laboratories) through Contracts #AC-3752D and #AL-3021, the National Science Foundation through Contracts #IRI-9410003 and #CDA-9222922, the Center for Transportation Studies through Contract #USDOT/DTRS 93-G-0017-1, the Army High Performance Computing Center and the Army Research Office through Contract #DAAH04-95-C-0008, the 3M Corporation, the McKnight Land-Grant Professorship Program, the Graduate School of the University of Minnesota, and the Department of Computer Science of the University of Minnesota. We would also like to thank John Fischer for the design of the hardware interface to allow computer control of the PUMA's gripper under VAL II's Alter facility.
- Active and real-time vision
- Experimental computer vision
- Systems and applications
- Vision-guided robotics