For a very long time, from the early trepanations of the Neolithic age until little more than 100 years ago, the basic principles of medical practice did not change significantly. The application of x-rays to gather images from the body's interior marked a major milestone and introduced a paradigm shift in how humans understood and practiced medicine.
The revolution introduced by medical imaging is still evolving. After x-rays, several other modalities have been developed, giving us new, different, and more complete views of the body's interior: tomography (CT and MR) gives a precise anatomical view and allows localization in space, nuclear medicine gives pictures of metabolism, and ultrasound and infrared imaging enable noninvasive imaging, to name only a few. All these magnificent innovations have one thing in common: they provide images as primary information, thus allowing us to literally see things and to capitalize on the unmatched capabilities of our visual system. Until recently, it has mainly been diagnostic imaging that has reaped the benefits of this technological revolution.
However, in recent times and under increasing economic pressure, the efficiency of medical applications has been increasingly challenged, on the view that health professionals do not value better diagnosis alone if it has no impact on the therapy procedure. As a result, interventional procedures are gaining importance relative to pure diagnostics and will become the driving force for the years to come. This new tendency goes by various names, such as intraoperative imaging, image-guided therapy, surgical navigation, computer-aided treatment, VR/AR in medicine, and so on. Many practitioners sense that these types of systems are parts of a bigger picture; however, the lack of a reference classification scheme has so far created a certain level of confusion among practitioners and users.
The ultimate goal of future developments must be to integrate all of these partial solutions into what we call a holistic, closed loop. Such an integrated reference system unifies the particular aspects in one unique, integrated, ubiquitous, transparent scheme. Figure 1 serves as a classification, giving a reference model of a holistic, closed loop of image-involving therapy systems, with an emphasis on minimally invasive procedures.
Figure 1. Classification scheme of an image-based therapy system.
The driving idea behind this research is that clinical procedures are not isolated but interact with each other; in particular, every module further down the pipeline accesses information generated by modules upstream. It is already common practice to perform preoperative image acquisition, (manual) organ segmentation and target definition, treatment planning, and diagnostic or interventional simulation. Some navigation systems are also available. During the intervention, however, organs can change position and/or shape. An advanced system must account for such changes intraoperatively by closing the loop, that is, by reacquiring the new anatomy during the intervention and adapting the initial plan to the new situation. Since these tasks take place under stressful conditions, all steps necessary for completing the loop must be integrated and must perform robustly and autonomously: the system must automatically recognize and segment organs, adapt the original operation plan to the intraoperatively registered morphology, update the navigation support accordingly, and present everything to the surgeon in a way that supports his or her tasks rather than interferes with them.
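To make the closed loop concrete, the control flow can be sketched in a few lines of code. This is purely an illustrative abstraction; all stage names, data structures, and the `anatomy_changed` callback are hypothetical and do not correspond to any particular existing system:

```python
# Illustrative sketch of a closed-loop image-guided therapy pipeline.
# Every stage name and data structure here is hypothetical.

def acquire_images(patient):
    """Preoperative or intraoperative image acquisition."""
    return {"patient": patient}

def segment_organs(images):
    """Recognize and segment organs in the acquired images."""
    return {"organs": ["liver"], "source": images}

def plan_treatment(model):
    """Derive an initial intervention plan from the segmented model."""
    return {"target": model["organs"][0], "version": 1}

def adapt_plan(plan, model):
    """Adapt the existing plan to newly registered morphology."""
    return {**plan, "version": plan["version"] + 1}

def closed_loop(patient, anatomy_changed, max_iterations=3):
    """Close the loop: reacquire and adapt whenever anatomy changes."""
    model = segment_organs(acquire_images(patient))
    plan = plan_treatment(model)
    for _ in range(max_iterations):
        if not anatomy_changed():
            break
        model = segment_organs(acquire_images(patient))  # reacquire anatomy
        plan = adapt_plan(plan, model)                   # update the plan
    return plan
```

The essential point of the sketch is that planning is not a one-shot preoperative step: the same segmentation and planning stages are re-entered intraoperatively each time the registered morphology changes.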
Within this special issue we focus on simulation systems. They sit far down the pipeline described previously and therefore possess a significant degree of complexity; at the same time, they are technically rather underdeveloped compared to other parts of the pipeline. Simulators should create an immersive, realistic feeling similar to treating a real patient. They should provide realistic views, support haptic feedback, and enable instrument handling. In addition, they must simulate tissue deformation and/or tissue removal. Some simulators already exist for isolated, rather simple areas, such as endoscopic or minimally invasive procedures (laparoscopy, colonoscopy, bronchoscopy, and so on); however, there is a great need for systems that simulate open surgery (for example, on the liver or heart). Simulators will play a fundamental role in teaching, training, and quality assurance for the coming generations.
We therefore distinguish two different simulator types: teaching and treatment planning. A teaching simulator enables tutoring and training of beginners without needing the physical presence of patients or instructors. Today's training mostly takes place either theoretically with such tools as books and phantoms (usually plastic copies of human organs), or later practically with direct patient involvement under instructor supervision. In a number of cases, training is suboptimal because adequate clinical cases are not always present. Simulators can help overcome this drawback by enabling conservation and reproduction of clinically relevant cases and by offering a realistic, standardized, and reproducible training environment. Instruction expertise can be incorporated in the training program. Teaching simulators employ static precollected and preedited data libraries with interesting medical cases.
Experienced doctors, on the other hand, would employ treatment-planning simulators. These offer the possibility of evaluating different treatment alternatives without the hazard of maltreating the real patient. Thus, such simulators employ individual patient data originating directly from the patient on the table. As a consequence, the questions of organ segmentation, organ behavior, automatic detection of risk areas, and so on must be addressed on the fly, because the typical manual preprocessing step available for preparing cases in training simulator systems is not possible.
In this issue we try—within the limited space available—to illuminate the key aspects of simulators and to cover a range from basic simulation techniques up to the evaluation of available systems:
• Paloc et al. present an approach for real-time simulation of tissue cutting and deformation during surgical interventions. They discuss mass spring as well as finite-element simulations performed on structures with online manipulated topology.
• Reitinger et al. use VR technologies to plan resections of liver tumors. They have developed new interaction paradigms and immersive visualization techniques.
• Morris et al. present a comprehensive training system for bone drilling procedures. The VR simulation includes multimodal rendering as well as haptic tutoring.
• Basdogan et al. perform in vivo soft tissue characterization, and they transfer the results of these measurements to their real-time deformation algorithms.
• Lamata et al. develop a conceptual framework for analyzing and evaluating laparoscopic VR simulators. They present an extensive overview of existing simulation systems and detail the results of their comparative studies.
We hope you will enjoy reading about these simulation systems, keeping in mind the complexity resulting from their location within a holistic and closed interaction loop.
Sakas heads a department at the Fraunhofer Institute for Computer Graphics, Darmstadt, and is an adjunct professor of biomedical engineering at the National Technical University of Athens. His research interests include medical imaging with an emphasis on therapy (visualization, VR/AR, navigation, simulation and training, and planning systems) and contemporary fields of image processing, such as 3D object reconstruction, content-based retrieval, and multimedia content analysis. Sakas has a PhD in computer graphics from the Technische Universität Darmstadt. Contact him at firstname.lastname@example.org.
Bockholt leads the Medicine and Virtual Reality Group within the Virtual and Augmented Reality department at the Fraunhofer Institute for Computer Graphics. His research interests include haptic rendering, VR-assisted rehabilitation, and surgical simulation. Bockholt has a PhD from the Technische Universität Darmstadt. Contact him at Ulrich.Bockholt@igd.fhg.de.