Research

Sensor-based Task Oriented Grasp Synthesis

One of the key challenges in task-oriented grasp synthesis is to mathematically represent a task. In our work, we represent a task as a sequence of constant screw motions. Given a grasp (a pair of antipodal contact locations), we can evaluate its feasibility for imparting the desired constant screw motion using our proposed task-dependent grasp metric. We have also developed a neural network-based approach that solves the inverse problem: given an object representation as a partial point cloud, obtained from an RGB-D sensor, and a task specified by a screw axis, compute a good grasping region for the robot to grasp the object and impart the desired constant screw motion. This task representation also allows us to couple our approach for task-oriented grasp synthesis with screw geometry-based motion planners. For more details, please visit the project page. More recently, we have formalized the notion of regrasping in order to satisfy the motion constraints. Using our task-dependent grasp metric and a manipulation plan, we can compute whether a single grasp suffices to execute the plan or whether the object must be regrasped along the way.
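To make the task representation concrete, the sketch below computes the rigid-body displacement produced by a constant screw motion, i.e., rotation by an angle about a screw axis (a line in space with a direction, a point on it, and a pitch coupling translation to rotation). This is standard screw theory, not our specific grasp-metric code; the function and variable names are illustrative.

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric (cross-product) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def screw_motion(axis_dir, axis_point, pitch, theta):
    """4x4 rigid transform for rotating by theta about a screw axis
    with direction axis_dir through axis_point, with the given pitch
    (translation along the axis per radian of rotation)."""
    w = np.asarray(axis_dir, float)
    w = w / np.linalg.norm(w)
    q = np.asarray(axis_point, float)
    W = skew(w)
    # Rodrigues' formula for the rotation part
    R = np.eye(3) + np.sin(theta) * W + (1.0 - np.cos(theta)) * (W @ W)
    # Translation: keep the axis point fixed by the rotation,
    # then slide along the axis by pitch * theta
    p = (np.eye(3) - R) @ q + pitch * theta * w
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T
```

A task such as opening a door would then be a sequence of such transforms, e.g. a zero-pitch screw about the hinge axis, sampled at increasing theta.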

Motion Planning for Manipulation

Through our work, we have shown that the screw-geometric representation of motion is a general way of representing motion plans in the task space, which is a subset of SE(3). This representation can be used for tasks that involve contact with the environment as well as for complex manipulation tasks like scooping and pouring, where it is difficult to specify the task constraints beforehand.
Representing such tasks as a sequence of constant screw motions in SE(3) allows us to extract the task-related constraints on the end-effector's motion from kinesthetic demonstrations and transfer them to new instances of the same tasks. This approach has been evaluated on scooping and pouring, and also in the context of vertical containerized farming for transplanting and harvesting leafy crops.
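The core geometric operation behind constant-screw motion plans can be sketched as follows: given a start and goal pose, recover the (unit) screw that connects them via the SE(3) matrix logarithm, then generate intermediate poses by scaling the rotation angle. This is a generic sketch of screw interpolation under the assumption of a generic displacement (rotation angle strictly between 0 and pi), not our planner's actual implementation.

```python
import numpy as np

def _skew(w):
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def se3_exp(w, v, theta):
    """Exponential of the unit twist (w, v) scaled by theta (||w|| = 1)."""
    W = _skew(w)
    R = np.eye(3) + np.sin(theta) * W + (1.0 - np.cos(theta)) * (W @ W)
    G = np.eye(3) * theta + (1.0 - np.cos(theta)) * W \
        + (theta - np.sin(theta)) * (W @ W)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = G @ v
    return T

def se3_log(T):
    """Unit twist (w, v) and magnitude theta of a rigid transform.
    Assumes the generic case 0 < theta < pi."""
    R, p = T[:3, :3], T[:3, 3]
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    w = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    W = _skew(w)
    G_inv = np.eye(3) / theta - W / 2.0 \
        + (1.0 / theta - 0.5 / np.tan(theta / 2.0)) * (W @ W)
    return w, G_inv @ p, theta

def screw_interpolate(T0, T1, fractions):
    """Poses along the constant screw motion taking T0 to T1."""
    w, v, theta = se3_log(np.linalg.inv(T0) @ T1)
    return [T0 @ se3_exp(w, v, s * theta) for s in fractions]
```

Because the recovered screw is expressed relative to the start pose, the same relative screw can be replayed from a different start pose, which is the basic mechanism behind transferring a demonstrated motion to a new task instance.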

Design of a Novel Robotic Hand

With a view towards building compact, reliable, dexterous robotic hands, we have explored the fundamental problem of designing a robotic finger with abduction/adduction and flexion/extension capability, and a size similar to a human finger, such that the fingertip motion and forces can be controlled. We have developed a novel 3-degree-of-freedom (3-DoF) series-parallel hybrid mechanism for a robotic finger that is capable of abduction/adduction and flexion/extension. We present the complete position kinematics and differential kinematics of our proposed finger mechanism and show through simulation examples that the fingertip can be kinematically controlled to follow a given path; both position control and velocity control capabilities are demonstrated. For more details, please refer to the papers and the YouTube video.

Manipulation while considering contacts

To enhance the ability of robotic hands and manipulators to grasp and manipulate a wide range of objects, the environment can be effectively exploited. Successfully accomplishing many tasks involves exploiting contacts between the grasped object and the environment. In our lab, we have developed algorithmic approaches for motion and force planning for manipulating objects by exploiting the environment while accounting for the nonlinear contact constraints.
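A minimal example of the kind of nonlinear contact constraint involved is the Coulomb friction cone: a contact force must press into the surface, and its tangential component must stay within the friction coefficient times the normal component. The sketch below is a generic feasibility check, not our planner's formulation.

```python
import numpy as np

def in_friction_cone(f, normal, mu, tol=1e-12):
    """Check the Coulomb friction constraint ||f_t|| <= mu * f_n
    for a contact force f, inward surface normal, and friction
    coefficient mu. The normal component must also be non-negative
    (contacts can only push, not pull)."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    fn = f @ n                        # normal component
    ft = np.linalg.norm(f - fn * n)   # tangential magnitude
    return fn >= 0.0 and ft <= mu * fn + tol
```

In a motion-and-force plan, a check of this form (or a conic-programming relaxation of it) is imposed at every environment contact along the trajectory, which is what makes the constraints nonlinear.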