EAES Academy

Controlling a Collaborative Robot with a Depth Camera using the IEEE 11073 SDC Communication Standard
EAES Academy. Berger J. 07/05/22; 366551; P294
Johann Berger
Abstract
Purpose

The benefits of robotic systems in image-guided interventions are widely known. Due to the high complexity of robotic workflows, however, the acceptance of active robots in the clinic still lags behind. Natural user-interaction principles and the possibility of easy system integration are the main concerns in modern medical robotics. A variety of tools, such as position sensors and standardized communication protocols for medical devices, has been published in the last 5-10 years. Utilizing these tools efficiently can provide the missing link between user-friendly interaction and robotic precision. In this work, we present the integration of an established infrared position sensor with a collaborative robotic system for ultrasound-guided interventions, e.g., needle biopsies and focused ultrasound ablations. This setup demonstrates the feasibility of a fast and intuitive integration of optical sensors with robotic systems for motion-control procedures.

Methods and Materials

To integrate a Kinect V2 body sensor (Microsoft, USA) with a KUKA LBR iiwa 7 R800 robot (KUKA AG, Germany), both systems were implemented as IEEE 11073 SDC-conformant virtual medical devices that share position information and control rights over the network. The Kinect's built-in body-tracking stream was used to drive movements of a Clarius L7 wireless ultrasound device (Clarius Mobile Health Corp., Canada) between pre-defined target positions and the user's hand. An NDI Polaris Vega tracking camera (Northern Digital Inc., Canada) provided the shared coordinate system for an initial co-registration of both devices using a basic 3-point landmark approach. The system was tested by performing 20 movement repetitions for 2 different setup configurations.
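The 3-point landmark co-registration described above amounts to a least-squares rigid-body fit between corresponding landmark coordinates measured in the two device frames. The abstract does not state which algorithm was used; a minimal sketch, assuming the standard SVD-based (Kabsch/Horn) solution and purely illustrative landmark values, could look like this:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid fit: find rotation R and translation t
    such that dst ≈ R @ src + t, via the Kabsch/SVD method.
    src, dst: (N, 3) arrays of corresponding landmark points (N >= 3,
    not collinear)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src = src.mean(axis=0)          # centroid of source landmarks
    c_dst = dst.mean(axis=0)          # centroid of destination landmarks
    # 3x3 cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in degenerate configurations
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

With exactly three non-collinear landmarks (as in the 3-point approach above) and noise-free measurements, this fit reproduces the transform exactly; with real tracking data, the residual of the fit gives a first estimate of the registration error.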

Results

The system provided stable and repeatable interaction between all co-registered devices while communicating via the SDC standard. For all issued commands, the robotic system behaved as expected, and the target positions provided by the Kinect sensor were reached successfully.

Conclusion

The implemented control system demonstrates the feasibility of integrating different sensor devices in a standardized fashion. The setup of a Kinect V2, a Polaris Vega, and a KUKA robotic arm constitutes a proof of concept for the straightforward augmentation of surgical robotics with novel interaction concepts. The presented system cannot yet support real-time-critical events and must be tested further. The landmark registration used must be assessed to identify possible inaccuracies and limitations. By providing a standardized method to integrate imaging robotics with position sensors, higher usability can be achieved for existing and new robots performing, e.g., ultrasound-guided needle biopsies and focused ultrasound ablations.