Professor, Rowan University, Cherry Hill, New Jersey, United States
Introduction: Surgical standards and training are achieved through numerous hours of practice, and failure, on simulated models under the supervision of a limited number of highly specialized surgeons [1]. Attaining the experience needed for operational proficiency in this way can be very time-consuming and costly for future specialists. Virtual reality (VR) simulators allow residents and skilled surgeons alike to learn new, complex surgical procedures on schedules that suit them and give them the opportunity to perfect their skills through failure at low risk [2]. This more flexible training regimen has translated into measurable improvements in the operating room, such as shorter surgical times, greater tool manipulability, and greater accuracy [3]. Motivated by these findings, we aim to develop the Robossis Surgical Simulator (RSS) to provide surgeons and operating staff with the skills required to operate the Robossis system.
Materials and Methods: We designed and developed the RSS to inherit the Robossis surgical environment previously used in a cadaver experiment [4]. The RSS immerses users in a 3D environment via the Meta Quest VR headset and delivers haptic feedback through the Sigma.7 haptic controller (HC). The Sigma.7 HC serves as the interface that maps the user's input trajectories and speed into the RSS environment to manipulate the Robossis surgical robot (RSR). A motion control algorithm scales the user's input trajectories to maximum linear and angular velocities that represent real-world operating conditions. Additionally, we developed kinematic models of the RSR and HC that resemble the physical systems. We compute the inverse kinematics to drive the RSR and HC joints so that each robot's end-effector reaches the desired position and orientation. Finally, we extended the separating axis theorem to detect collisions between the distal and proximal bone segments and, from these, determine the haptic feedback that restricts bone-bone penetration.
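The velocity-limiting step described above can be sketched as follows. This is a minimal illustration, not the RSS implementation: the limit values and function name are hypothetical placeholders, and the actual controller may scale rather than clamp.

```python
import numpy as np

def clamp_velocity(raw_linear, raw_angular, max_linear=0.05, max_angular=0.5):
    """Limit a raw user-input velocity so its magnitude never exceeds
    the configured linear (m/s) and angular (rad/s) maxima.
    Direction is preserved; only the magnitude is reduced."""
    v = np.asarray(raw_linear, dtype=float)
    w = np.asarray(raw_angular, dtype=float)

    lin_speed = np.linalg.norm(v)
    if lin_speed > max_linear:
        v = v * (max_linear / lin_speed)  # rescale onto the speed limit

    ang_speed = np.linalg.norm(w)
    if ang_speed > max_angular:
        w = w * (max_angular / ang_speed)

    return v, w
```

Clamping per control cycle keeps the mapped robot motion bounded even when the user's hand moves abruptly, which is one simple way to make the simulated robot behave like its rate-limited physical counterpart.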
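To illustrate the inverse-kinematics step, the sketch below solves a planar two-link arm in closed form. This is a deliberate simplification: the RSR and Sigma.7 HC have more degrees of freedom, and the link lengths and function name here are assumptions for illustration only.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar 2-link arm.
    Given a target end-effector position (x, y) and link lengths l1, l2,
    return joint angles (theta1, theta2) for the elbow-up solution,
    or None if the target is outside the reachable workspace."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)  # law of cosines
    if abs(c2) > 1.0:
        return None  # target unreachable
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

Driving each joint to the angles returned by such a solver moves the end-effector to the commanded pose, which is the role inverse kinematics plays for both the RSR and the HC in the simulator.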
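The separating-axis test at the heart of the bone-bone collision check can be sketched as below. The function names and tolerance are illustrative assumptions; the RSS extension of the theorem to the actual bone-segment geometry is not reproduced here.

```python
import numpy as np

def project(points, axis):
    """Project a convex point set onto a unit axis; return its (min, max) interval."""
    d = points @ axis
    return d.min(), d.max()

def sat_collision(points_a, points_b, candidate_axes):
    """Separating axis theorem for two convex point sets.
    Returns True if the sets collide, i.e., no candidate axis separates
    their projections. For convex polyhedra, the candidate axes are the
    face normals of both bodies plus cross products of their edge directions."""
    for axis in candidate_axes:
        n = np.linalg.norm(axis)
        if n < 1e-12:
            continue  # degenerate axis from near-parallel edges
        a = np.asarray(axis, dtype=float) / n
        min_a, max_a = project(points_a, a)
        min_b, max_b = project(points_b, a)
        if max_a < min_b or max_b < min_a:
            return False  # separating axis found: no collision
    return True
```

When the test reports contact, a haptic controller can render a restoring force along the axis of least penetration, which is how a collision check of this kind can feed the feedback that restricts bone-bone penetration.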
Results, Conclusions, and Discussions: The RSS environment immerses trainees in a realistic operating-room environment for femur fracture surgery using the Robossis system (Fig. 1). To interact with the environment, the Sigma.7 HC is used to manipulate the distal bone segment in the desired translational and rotational directions. Validation and simulation testing were conducted to measure the deviation of the RSR from the motion of the user's hand via the HC. The results show that the user's hand motion was accurately translated to RSR movement, with a maximum deviation of ~5 mm in translation and ~0.6 deg in rotation (Fig. 2). Additionally, the integration of haptic feedback into the RSS provides users with a virtual representation of the bone segments and their collisions during training (Fig. 3), so realistic behavior is experienced on the simulator. In summary, we developed a haptic-enhanced VR simulator designed explicitly for the Robossis femur fracture procedure. This advancement not only enhances the learning curve but also promises to elevate the standard of medical education and patient care.
References: 1. M. P. Rogers et al., Surgery. 2021;169(5):1250–1252. 2. A. J. Lungu et al., Expert Rev Med Devices. 2021;18(1):47–62. 3. M. P. Fried et al., Otolaryngol Head Neck Surg. 2010;142(2):202–207. 4. M. S. S. et al., IEEE Robot Autom Lett. 2023;8(5):2438–2445.