Lab Projects

Selected Projects

1. Model Predictive Control Based Haptic Guidance for Performance Improvement: Furthering Surgical Training

2. Real-time Haptic Guidance for Minimally Invasive Surgery (MIS) Training Simulation

3. Haptic-Enabled Mobile Robotics: Acceleration Based Variable Force Feedback in Obstacle-Avoidance Task

4. Providing Haptic Feedback in Robot-Assisted Minimally Invasive Surgery: A Direct Optical Force Sensing Solution for Haptic Rendering of Deformable Bodies

5. Automated Ligation Device for Laparoscopic Surgery

Project Highlights

1. Model Predictive Control Based Haptic Guidance for Performance Improvement: Furthering Surgical Training

Members: Ali Safavi, Loi Huynh, and Dr. Mehrdad H. Zadeh

The main goal of this project is to provide haptic guidance that improves surgical performance in minimally invasive surgery training, and to develop frameworks for improving such systems. The latest work involves a model-predictive haptic guidance approach that automatically learns a task from an expert's demonstration using Robot Learning from Demonstration (LfD), and applies guidance forces through a hybrid Model Predictive Control (MPC) and Artificial Neural Network (ANN) framework to manage human uncertainty in the human-robot interaction. The LfD method has been modeled with hidden Markov models (HMMs), and the results of the project so far have been promising. Future work leans toward applying deep learning methods to the LfD component and extending the framework to more complicated tasks and to robotic surgery.
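The learning-from-demonstration idea can be pictured with a deliberately simplified sketch: expert demonstrations are averaged into a reference trajectory, and a spring-like force guides the trainee toward it. All function names and gains below are invented for illustration; the actual project uses HMM-based LfD and an MPC/ANN controller, not this naive averaging.

```python
# Toy sketch of LfD-style haptic guidance (illustrative only).
# The expert's demonstrations are averaged into a reference trajectory,
# and a spring-like guidance force pulls the trainee toward it.

def learn_reference(demonstrations):
    """Average several expert demonstrations (lists of 1-D positions,
    assumed time-aligned) into one reference trajectory."""
    n = len(demonstrations[0])
    return [sum(d[t] for d in demonstrations) / len(demonstrations)
            for t in range(n)]

def guidance_force(x_ref, x_user, k_guide=5.0):
    """Spring-like guidance force toward the learned reference."""
    return k_guide * (x_ref - x_user)

demos = [[0.0, 0.10, 0.20, 0.3],
         [0.0, 0.12, 0.18, 0.3]]
ref = learn_reference(demos)
f = guidance_force(ref[2], 0.15)   # user lags the reference at step 2
```

In the real system, an HMM would replace the averaging step, and the MPC would replace the single spring gain with a receding-horizon force plan.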

Return To Project List

2. Real-time Haptic Guidance for Minimally Invasive Surgery (MIS) Training Simulation

Members: Hadi Rahmat-Khah, Ehsan Zahedi, Loi Huynh, Dr. Javad Dargahi, and Dr. Mehrdad H. Zadeh

In this project, a gesture-based haptic guidance approach for physical Human-Robot Interaction (pHRI) in a haptic-enabled Minimally Invasive Surgery (MIS) training environment has been developed. In this approach, a virtual MIS task is segmented into primitive gestures, and reference hidden Markov models (HMMs) are trained on these gestures. Guidance forces are adaptively computed in real time from the gestural differences between the user's motions and the reference models. The approach is partially task-independent and robust to spatial variation of gestures, thanks to the HMM-based segmentation scheme. Our experimental results show that variable impedance control can effectively improve user performance in the presence of human behavioral uncertainty during a pHRI task. This model-based haptic guidance could be extended to real-time skill assessment, sports training, and rehabilitation exercises.
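As a rough illustration of the variable impedance idea (not the lab's implementation), the sketch below scales guidance stiffness with the user's deviation from a reference gesture. A plain Euclidean distance stands in for the HMM-based gesture comparison, and all gains are invented:

```python
# Illustrative variable-impedance guidance: stiffness grows as the user's
# motion deviates from the reference gesture. In the real system, an
# HMM-based gesture model would replace this simple distance measure.

def deviation(x_user, x_ref):
    """Euclidean distance between user and reference positions (2-D)."""
    return ((x_user[0] - x_ref[0]) ** 2 + (x_user[1] - x_ref[1]) ** 2) ** 0.5

def variable_impedance_force(x_user, x_ref, k_min=0.5, k_max=8.0, d_sat=0.05):
    """Stiffness ramps from k_min to k_max as deviation approaches d_sat,
    then the force pulls the user back toward the reference."""
    d = deviation(x_user, x_ref)
    k = k_min + (k_max - k_min) * min(d / d_sat, 1.0)
    return (k * (x_ref[0] - x_user[0]), k * (x_ref[1] - x_user[1]))

fx, fy = variable_impedance_force((0.02, 0.0), (0.0, 0.0))
```

The design intuition is that small deviations get gentle correction (preserving the trainee's agency), while large deviations meet strong resistance.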

Return To Project List

3. Haptic-Enabled Mobile Robotics: Acceleration Based Variable Force Feedback in Obstacle-Avoidance Task

Members: Kalanathlalith Gunturu, Nicolas Cramer, David Racine, & Mehrdad Zadeh.

Obstacle avoidance is an important task in the navigation of mobile robots. Here, a human is in the loop, since the robot is controlled by a tele-operator, and limited perception of the environment reduces the operator's effectiveness. To increase the operator's awareness, haptic effects are routinely added. We investigate the effects of using a force-feedback steering wheel and a haptic pedal in an obstacle-avoidance task. The steering wheel and gas pedal were included primarily for ergonomic reasons, and the haptic effects applied to them augment the driving task. Results show that haptic feedback assisted in maneuvering the robot during the obstacle-avoidance task.
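One way to picture acceleration-based variable force feedback is the toy model below, in which the resistive pedal force grows with both the commanded acceleration and the proximity to the nearest obstacle. The function, gains, and thresholds are illustrative assumptions, not the system's actual control law:

```python
# Toy model of acceleration-based pedal feedback for obstacle avoidance:
# the resistive force on the gas pedal discourages aggressive acceleration
# when the robot is close to an obstacle.

def pedal_feedback(accel_cmd, obstacle_dist, d_safe=2.0, k=4.0):
    """Resistive pedal force: zero beyond d_safe, then scaling with both
    the commanded acceleration and how far inside the safe zone we are."""
    if obstacle_dist >= d_safe:
        return 0.0
    proximity = (d_safe - obstacle_dist) / d_safe   # 0 (edge) .. 1 (contact)
    return k * accel_cmd * proximity

f_far = pedal_feedback(accel_cmd=1.0, obstacle_dist=5.0)   # open space
f_near = pedal_feedback(accel_cmd=1.0, obstacle_dist=1.0)  # near obstacle
```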

Return To Project List

4. Providing Haptic Feedback in Robot-Assisted Minimally Invasive Surgery: A Direct Optical Force Sensing Solution for Haptic Rendering of Deformable Bodies

Members: Shervin Ehrampoosh, Reza Yousefian, Mohit Dave, Mohamed Elnaggar, Jacob Nangle, Raniel Ornelas, Pedro Henrique Affonso, Garret Kottmann, Dr. Micheal Kia, and Dr. Mehrdad Zadeh.

Motivation: Minimally invasive surgery (MIS) has advanced considerably in recent years. However, certain limitations remain. A major shortcoming of MIS, and the subject of this research, is the lack of sensory information from the operative field available to the surgeon, resulting in reduced haptic access between the surgeon and the tissue. Force sensing could therefore improve safety, reduce intraoperative time, and help less experienced surgeons perform complex surgeries with less practice.

Method: In this study, a bilateral master-slave tele-manipulation system with a parallel force-position control architecture is presented, as shown in Fig. 2. In this method, two Phantom Omni devices are directly assigned as the master and slave, connecting the operator's and the environment's commands to the teleoperation system. On the slave side, a proposed optical force sensor was designed and prototyped to directly measure the contact forces between the environment and the slave device. Three optical fibers transmit and receive light while a reflector moves axially. The control architecture combines position error based (PEB) control and direct force reflection (DFR).

Block diagram of the parallel force/position architecture.
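The two control laws named above can be sketched in outline as follows. Function names, gains, and signatures are invented for illustration and do not reproduce the implemented controller:

```python
# Outline of the two control laws in the architecture (gains illustrative):
# PEB drives the slave to track the master's position with a PD law, and
# DFR reflects the optically measured contact force back to the master.

def peb_slave_force(x_master, x_slave, v_slave, kp=100.0, kd=5.0):
    """Position-error-based (PEB) PD law pushing the slave toward the
    master's position while damping slave velocity."""
    return kp * (x_master - x_slave) - kd * v_slave

def dfr_master_force(f_sensed, scale=1.0):
    """Direct force reflection (DFR): feed the sensed contact force back
    to the master, opposing the operator's motion into the tissue."""
    return -scale * f_sensed

f_slave = peb_slave_force(x_master=0.10, x_slave=0.08, v_slave=0.0)
f_master = dfr_master_force(f_sensed=1.5)
```

Running the two laws in parallel is what makes the architecture a parallel force/position scheme: position tracking and force reflection operate as separate channels.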

Result: Two experiments were conducted to examine the potential effects of force feedback in the proposed MIS teleoperation system. The equipment included three deformable objects, a sponge, a suturing pad, and a model organ, serving as high-, medium-, and low-deformability samples, as well as a box of the kind surgeons use to practice minimally invasive surgery. The contents of the box were hidden from view so that only the effect of haptic feedback was tested.
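A minimal sketch, with invented gains, of how assigning different PD gains per object renders different apparent stiffness at the haptic device, which is what lets users tell the objects apart by feel:

```python
# Illustrative per-object PD gains (values invented): softer objects get
# lower stiffness, so the rendered contact force differs by object even
# at the same penetration depth.

GAINS = {
    "sponge":      {"kp": 50.0,  "kd": 1.0},   # highly deformable: soft
    "suture_pad":  {"kp": 200.0, "kd": 3.0},   # medium deformability
    "model_organ": {"kp": 600.0, "kd": 6.0},   # least deformable: stiff
}

def rendered_force(obj, penetration, velocity):
    """PD contact force for a given penetration depth into the object."""
    g = GAINS[obj]
    return g["kp"] * penetration + g["kd"] * velocity

soft = rendered_force("sponge", 0.01, 0.0)
stiff = rendered_force("model_organ", 0.01, 0.0)
```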

Experimental master-slave setup for the tele-operation system.

First, twenty subjects were randomly chosen to feel the three materials, placed in a surgical box, directly using a laparoscopic tool. The purpose of this experiment was to determine how accurately people detect the deformability of the objects, providing baseline data for the teleoperation experiments. Then, another ten subjects were randomly chosen to operate the master-slave system and feel the same objects. For each material, a different set of PD controller gains changed the type of feedback, allowing the user to distinguish each object. The experiment was conducted both with and without haptic feedback. Comparing the results, the PD controller values matched the direct measurements exactly.

Return To Project List

5. Automated Ligation Device for Laparoscopic Surgery

Members: Reza Yousefian, Paul Jones, Christopher Barnard, Lucas Bell, Daniel Gudorf, Shervin Ehrampoosh, & Mehrdad Zadeh.

The objective of this project is to occlude a vessel using a suture and to repeat the process several times. This device is needed because the only alternative method, stapling, has multiple disadvantages, such as requiring a great deal of space behind the vessel. For a laparoscopic stapling device to work, the surgeon needs to move the tool several millimeters behind the vessel, and many vessels do not have the required space behind them. This forces surgeons to pull on the vessel, which could damage it, in order to force the device past it. Our device will not require as much space behind the vessel, so it will reduce the risk of damage. Also, when using a laparoscopic stapling device, the vessel may slip beyond the staple while it is being clamped down upon by the stapling jaw. (Further details are confidential pending the patent filing associated with the final report.)

Return To Project List