The SARAFun project has been formed to enable a non-expert user to integrate a new bi-manual assembly task on a collaborative robot in less than a day. This will be accomplished by augmenting the robot with cutting-edge sensory and cognitive abilities, as well as the reasoning abilities required to plan and execute an assembly task. The overall conceptual approach is that the robot should be capable of learning and executing assembly tasks in a human-like manner. Studies will be made to understand how human assembly workers learn and perform assembly tasks. The human performance will be modeled and transferred to the collaborative robot as assembly skills. The robot will learn assembly tasks, such as insertion or folding, by observing the task being performed by a human instructor. The robot will then analyze the task, generate an assembly program including exception handling, and design 3D-printable fingers tailored for gripping the parts at hand. Aided by the human instructor, the robot will finally learn to perform the actual assembly task, relying on sensory feedback from machine vision, force and tactile sensing, as well as physical human-robot interaction. During this phase the robot will gradually improve its understanding of the assembly at hand until it is capable of performing the assembly in a fast and robust manner.
SARAFun targeted Breakthrough
The main objective of SARAFun is to enable a robot system to "learn and execute assembly operations in a human-like manner". While the project concept is to learn and take inspiration from how humans learn and perform assembly, the objective is not to mimic this entirely, but rather to study how human capabilities can be transcribed to the robot system in a meaningful and industrially relevant way. The project will work with a collaborative robot, which has some similarities with human motion capabilities but also some limitations, the biggest being the use of parallel grippers instead of dexterous articulated hands with sensing. The robot currently operates under position control. While this gives fast, precise and repeatable motions, it is not well suited to the inherently unstructured assembly task, where variations in part locations and tolerances make a pure position approach very cumbersome to realize. In the project the robot will be fitted with additional sensing capabilities, as well as cognitive abilities inspired by the human process of understanding and learning assembly tasks.
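To illustrate why pure position control struggles with part-location tolerances, the following is a minimal, hypothetical sketch (not SARAFun code) of a one-dimensional admittance law: the commanded position is shifted in proportion to the measured contact-force error, so the robot yields on contact instead of jamming against a misplaced part. The function name, gains and numbers are illustrative assumptions.

```python
def admittance_step(x_cmd, f_measured, f_desired=0.0, compliance=0.002):
    """Correct a position command using force feedback (illustrative).

    x_cmd      -- commanded position along the insertion axis [m]
    f_measured -- contact force measured by a wrist sensor [N]
    f_desired  -- target contact force [N]
    compliance -- how far the target backs off per newton of error [m/N]
    """
    f_error = f_measured - f_desired
    # Pure position control would return x_cmd unchanged regardless of
    # contact forces; the compliance term lets the robot give way.
    return x_cmd - compliance * f_error

# A 10 N jamming force backs the 0.05 m target off to 0.03 m,
# while free-space motion (zero force) is left untouched.
print(admittance_step(0.05, f_measured=10.0))  # -> 0.03
print(admittance_step(0.05, f_measured=0.0))   # -> 0.05
```

In a real controller this update would run inside the servo loop on each force-sensor sample; the sketch only shows the core idea of trading position accuracy for compliance at contact.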
The research leading to these results has received funding from the European Community’s Framework Programme Horizon 2020 – under grant agreement No 644938 – SARAFun.