
Publication

Combining Modeled and Learned Information to Rapidly Deploy Human-Robot Collaborative Tasks

Book - Dissertation

One of the main challenges for emerging robots in Industry 4.0 is the capability to expedite and facilitate the deployment of a large variety of robot applications in unstructured and unpredictable environments. Typically, the cost associated with the time and expertise needed to program these robot applications corresponds to a substantial share of the total cost of an automation project. Moreover, this share is expected to increase further because manufacturers are increasingly demanding flexibility (applications with smaller lot sizes and larger product variety), more advanced programming (including sensor-based robot control), and rapid reconfiguration (one robot handling multiple scenarios or even multiple applications).

To tackle these challenges, this research introduces a flexible framework that enables a user to combine modeled and learned information to expedite the deployment of sensor-based applications. These applications are predictable and robust against uncertainties associated with sensor information and unstructured environments, thereby enabling humans to collaborate with the robots. To this end, we formulate sensor-based robot applications as constrained optimization problems using the constraint-based expression-graph-based Task Specification Language (eTaSL). This framework allows the robot control problem to be defined not only in terms of the degrees of freedom (DOF) associated with the robot, i.e., the joint variables, but also in terms of feature variables, which facilitate the definition of the DOF that govern a manipulation task (e.g., the definition of a motion along a path in terms of its degree of advancement). This separation allows us to formalize an extensible library of sensor-based robot behaviors that separates aspects of the robot, the task, and the environment, enabling their intuitive composition and their transfer between robot platforms, environments, and robot applications. These behaviors cover aspects of robot tasks such as the generation of pose and wrench profiles, reactive evolution along these profiles, robot motion, collision avoidance, the relative pose between a tool and a workpiece, joint limits, and workspace limits.
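As a rough illustration of this kind of constraint-based specification (generic notation, not necessarily the exact formulation used in the dissertation), the control problem can be posed at each sampling instant as a velocity-level constrained optimization over the joint velocities and feature-variable velocities, with slack variables softening the task constraints:

% Illustrative sketch in generic notation; the weights W, gain matrix K, and error vector e are assumptions.
\begin{aligned}
\min_{\dot{q},\,\dot{\chi}_f,\,\epsilon}\quad & \|\dot{q}\|_{W_q}^{2} + \|\dot{\chi}_f\|_{W_f}^{2} + \|\epsilon\|_{W_\epsilon}^{2} \\
\text{s.t.}\quad & J_q(q,\chi_f)\,\dot{q} + J_{\chi}(q,\chi_f)\,\dot{\chi}_f = -K\,e(q,\chi_f) + \epsilon, \\
& \dot{q}_{\min} \le \dot{q} \le \dot{q}_{\max}, \qquad q_{\min} \le q \le q_{\max},
\end{aligned}

where q are the joint variables, \chi_f the feature variables (e.g., the degree of advancement along a path), e(q, \chi_f) collects the task errors (such as tool-workpiece pose or wrench errors), and J_q, J_\chi are their partial derivatives with respect to the joint and feature variables.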
As a first contribution, this framework integrates the Learning from Demonstration (LfD) methodology Trajectory parameterized Probabilistic Principal Component Analysis (traPPCA) into eTaSL, thereby incorporating information acquired from demonstrations into the robot control problem. This integration facilitates the commissioning of reactive sensor-based robot applications while limiting the number of demonstrations, increasing the predictability of trajectories generated towards non-demonstrated targets, and providing a guideline for hyper-parameter selection. Furthermore, this methodology facilitates the commissioning of robot applications consisting of approach and contact tasks by enabling the acquisition of pose and wrench information with a device equipped with a pose tracker and a force/torque (F/T) sensor.

As a second contribution, the framework is extended with a proximity-based collision-avoidance behavior based upon massive sensor input from an artificial robot skin. The proposed reactive behavior takes as input the proximity signals of a selected number of skin cells, enabling the robot to avoid collisions by holding its motion, deviating from, or moving backward along a nominal trajectory (a minimal sketch of this logic follows the abstract). As a result, the proposed behavior enables humans and robots to collaborate while sharing the workspace.

As a third contribution, the framework is further extended with composable behaviors that generate goal-driven motions within state-space manifolds. These manifolds are defined in terms of the variability encoded from demonstrations, deformations of local sections of the trajectories, and tolerances on the tool-workpiece relative pose. The proposed behavior also includes non-demonstrated sources of variability, improving its adaptability in cluttered environments and in the presence of dynamic obstacles.

As a proof of concept, and to show the flexibility of our framework, we developed and deployed six industrially relevant applications: orange picking and packing, solenoid assembly, cluttered bin picking and packing, bottle opening, 2D contour tracking, and ultrasonic welding for mask production. The last application is the prototype of a process in a production line designed to alleviate the shortage of face masks in Belgium.
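The hold/deviate/retreat behavior of the second contribution can be pictured as a modulation of the trajectory's degree of advancement by the closest proximity reading. The following Python sketch is purely illustrative: the function name, thresholds, and units are assumptions rather than the dissertation's implementation (which is formulated as eTaSL constraints), and the deviation term is omitted.

# Minimal sketch (hypothetical, not the dissertation's code): modulate the rate of
# advancement along a nominal trajectory from proximity readings of selected skin cells.

def advancement_rate(proximities, s_dot_nominal,
                     d_slow=0.30, d_hold=0.15, d_retreat=0.07):
    """Return the commanded rate of the path's degree of advancement.

    proximities   -- distances [m] reported by the selected skin cells
    s_dot_nominal -- nominal forward rate along the trajectory
    d_slow, d_hold, d_retreat -- assumed thresholds for slowing, holding, retreating
    """
    d = min(proximities)               # closest detected obstacle
    if d >= d_slow:                    # far away: follow the nominal trajectory
        return s_dot_nominal
    if d >= d_hold:                    # approaching: slow down proportionally
        return s_dot_nominal * (d - d_hold) / (d_slow - d_hold)
    if d >= d_retreat:                 # too close: hold the motion
        return 0.0
    # dangerously close: move backward along the nominal trajectory
    return -s_dot_nominal * (d_retreat - d) / d_retreat


if __name__ == "__main__":
    for readings in ([0.50, 0.40], [0.20, 0.35], [0.10, 0.25], [0.03, 0.30]):
        print(readings, "->", round(advancement_rate(readings, 0.1), 4))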
Publication year: 2020
Accessibility: Open