Towards human-level dexterity

Human dexterity is extremely hard to replicate, and we are just at the beginning. Yasemin Bekiroglu is one of Chalmers' new recruits in AI and robotics, and she is building her group with a broad approach, ranging from simpler tasks to more complex ones such as assembly.

Yasemin Bekiroglu is an Assistant Professor at Chalmers' Department of Electrical Engineering, in the Automatic Control group. Her research is supported through a joint initiative between CHAIR and Genie, Chalmers' gender initiative for excellence.


– The central question of my research is how we can create robots capable of adapting so that they can co-inhabit our world. This means designing systems that can function in unstructured environments that are continuously changing, with unlimited combinations of object shapes, sizes, appearances and positions; that adapt to changes and learn from experience and from humans; and, importantly, that do so from small amounts of data. In particular, my work focuses on grasping and manipulation, fundamental capabilities that enable a robot to interact with the environment, with humans and with other agents. I'm fascinated by how sophisticated human hands are, and very interested in mimicking human dexterity with robotic arms and hands in grasping and manipulation tasks, says Yasemin Bekiroglu.


She is at the beginning of her work at Chalmers, and the goal is to build a group with a wide perspective. The research is therefore kept generic, not limited to specific functionality.


– Human dexterity is extremely hard to replicate, and we are just at the beginning. Right now, we are looking into simple movements for robotic arms and hands, such as picking up objects of varying shapes, sizes and materials, but we are also aiming at more delicate and complex tasks such as assembly and insertion, says Yasemin Bekiroglu.


The group works mostly with collaborative robots, and the approach is not to re-invent the wheel: they use existing hardware and study how best to equip it with various sensing and perception capabilities to make it more intelligent. The first step is to build a framework that can be extended, starting with simpler tasks such as having the robotic hand pick something up, and then moving on to more complex manipulation tasks that can be learned from demonstration and experience.
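As a purely illustrative sketch of what such an extensible framework might look like (not the group's actual code; every class and action name here is hypothetical), one could define a common task interface that a simple pick-up skill implements first, with more complex skills such as insertion added later against the same interface:

```python
from abc import ABC, abstractmethod

class ManipulationTask(ABC):
    """Common interface for every skill, so more complex tasks can be
    added later without changing the surrounding framework."""

    @abstractmethod
    def plan(self, scene: dict) -> list:
        """Return a sequence of primitive robot actions for this scene."""

class PickUp(ManipulationTask):
    """Simplest skill: grasp a single object detected in the scene."""

    def plan(self, scene):
        target = scene["object"]
        return [f"move_above({target})", f"close_gripper({target})", "lift()"]

class Insertion(ManipulationTask):
    """A more complex skill that reuses and extends the simpler one."""

    def plan(self, scene):
        return PickUp().plan(scene) + [f"align_with({scene['hole']})", "insert()"]

# Hypothetical usage: the framework treats both tasks uniformly.
for task in (PickUp(), Insertion()):
    print(type(task).__name__, task.plan({"object": "peg", "hole": "socket"}))
```

The point of the design is that the surrounding framework only ever talks to the shared interface, so a skill learned from demonstration can be slotted in beside a hand-coded one.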
 


– Most of our initial data comes from vision sensors, used to plan an action. We aim to complement vision data with other sensory data, such as from touch sensors, and leverage it much as humans combine vision with touch to perceive, grasp and manipulate a previously unseen object. A key technical difficulty is how to encode, reason about and overcome the uncertainties that are inherent both in robotic perception and in the robot's physical interactions with objects, says Yasemin Bekiroglu.


– But there is a long way to go before we have robots equipped with perception and interaction skills close to the human level, able to act in and adapt to their environment the way we do. Our research goal is to create assistants and collaborative robots that will facilitate human work and life, she says.


Example grasping experiments: picking up novel, arbitrarily shaped objects from clutter using a robotic setup consisting of a 7-degrees-of-freedom arm equipped with a parallel-jaw gripper with flat fingers. The image is adapted from "Benchmarking Protocol for Grasp Planning Algorithms", Bekiroglu et al., IEEE Robotics and Automation Letters (RA-L), 2019.


Published: Tue 07 Sep 2021.