M. Allgaier et al., "LiVRSono - Virtual Reality Training with Haptics for Intraoperative Ultrasound," in 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Sydney, Australia, 2023, pp. 980-989. doi: 10.1109/ISMAR59233.2023.00114

Keywords: training; solid modeling; ultrasonic imaging; three-dimensional displays; input devices; liver; surgery

Abstract: One of the biggest challenges in using ultrasound (US) is learning to create a spatial mental model of the interior of the scanned object based on the US image and the probe position. As intraoperative ultrasound (IOUS) cannot be easily trained on patients, we present LiVRSono, an immersive VR application to train this skill. The immersive environment, including a US simulation with patient-specific data as well as haptics to support hand-eye coordination, provides a realistic setting. Four clinically relevant training scenarios were identified based on the described learning goal and the workflow of IOUS for liver. The realism of the setting and the training scenarios was evaluated with eleven physicians, of whom six are experts in IOUS for liver and five are potential users of the training system. The setting, the handling of the US probe, and the US image were considered realistic enough for the learning goal. Regarding the haptic feedback, a limitation is the restricted workspace of the input device. Three of the four training scenarios were rated as meaningful and effective. A pilot study on learning outcomes shows positive results, especially with respect to confidence and perceived competence. Despite the drawbacks of the input device, our training system provides a realistic learning environment with meaningful scenarios for training the creation of a mental 3D model when performing IOUS. We also identified important improvements to the training scenarios to further enhance the training experience.

URL: https://doi.ieeecomputersociety.org/10.1109/ISMAR59233.2023.00114