The ability to grasp and manipulate objects is essential for interacting with the environment. Recent years have seen a proliferation of research aimed at deploying robotic manipulation in real-world applications such as human-robot collaboration and industrial tasks. Despite this promising progress, robotic grasping and manipulation has yet to demonstrate the robustness and dexterity needed for everyday life and industrial environments, particularly when facing novelty and uncertainty, e.g., in object shape, pose, weight, and friction at contacts, or in unstructured environments.
Studies on human grasping and manipulation have shown that sensory capabilities play a key role in successful manipulation, enabling better perception of the object and of the interaction with it, and revealing adaptation and control strategies, e.g., exploiting the environment and its constraints for more effective manipulation. Inspired by these findings, robotics research aiming to make grasping and manipulation skills more robust has highlighted the importance of effective use of sensory data (visual, tactile, proprioceptive) from the planning stage through task completion. Various kinds of approaches have been proposed: data-driven and empirical approaches such as learning from experience and from human demonstration, analytic approaches such as manually modelling physical and dynamical constraints, and hand-design approaches such as under-actuated and soft hands.
In this workshop, we aim to bring together researchers and experts in key areas of grasping and manipulation, such as perception, control, learning, the design of hands and grippers, and studies analysing human manipulation skills. We aspire to survey recent developments in these research areas, in both theory and applications, by discussing recent achievements, debating underlying assumptions, and identifying challenges for future progress.
Topics of interest:
The workshop topics include (but are not limited to):
- Perception-guided grasping and manipulation (vision, touch)
- Grasp and manipulation planning
- Learning for grasping and manipulation (e.g., from human demonstration, exploration)
- Collaborative manipulation
- Bi-manual manipulation
- Visual, tactile servoing
- Closed-loop grasping and manipulation
- End-effector design (e.g., anthropomorphic, underactuated)
- Human manipulation and grasping
- Reactive control strategies for object manipulation
- Deformable object manipulation
- Multimodal interactive perception
- Sensor fusion based on tactile, force and visual feedback
- In-hand manipulation
Call for Papers:
We welcome the submission of two-page extended abstracts describing new or ongoing work. Final instructions for poster presentations and talks will be available on the workshop website after decision notifications have been sent. All abstracts will be accessible on the workshop website. Submissions should be in .pdf format. Please send submissions to valerio[dot]ortenzi[at]qut[dot]edu[dot]au with the subject line “Humanoids 2017 Workshop Submission”. For any questions or clarifications, please contact the organisers.
Abstract submission deadline: October 25, 2017
Acceptance notification: November 5, 2017
Final materials due: November 10, 2017
Workshop date: November 15, 2017
Accepted Papers:
Object Modeling and Grasping Pipeline based on Superquadric Models, Giulia Vezzani, Ugo Pattacini and Lorenzo Natale, (pdf)
Towards Reactive and Robust Manipulation Tasks using Behavior Trees, Michele Colledanchise and Lorenzo Natale, (pdf)
Markerless visual servoing for humanoid robot platforms, Claudio Fantacci, Ugo Pattacini, Vadim Tikhanoff and Lorenzo Natale, (pdf)
Generative Perception for Robotic Grasping, Douglas Morrison, Peter Corke and Jürgen Leitner, (pdf)
Hierarchical Grasp Detection for Visually Challenging Environments, D. Morrison, N. Kelly-Boxall, S. Wade-McCue, P. Corke, and J. Leitner, (pdf)
A Framework for Bimanual Folding Assembly Under Uncertainties, Diogo Almeida and Yiannis Karayiannidis, (pdf)
Simulation of the underactuated Sake Robotics Gripper in V-REP, Simon-Konstantin Thiem, Svenja Stark, Daniel Tanneberg, Jan Peters and Elmar Rueckert, (pdf)