CV SURGERY AND SURGERY SIMULATION

Early Innovations in 3D Medical Imaging and Segmentation

Prof. Mariano Alcañiz’s career began with groundbreaking work in 3D medical image segmentation and visualization in the 1990s, laying a foundation for today’s image-guided interventions. As early as 1997, he co-led the HIPERCIR project (“Segmentation and 3D Visualisation of Medical Images: an HPCN Demonstrator for Radiology”), which delivered one of the first high-performance platforms for interactive 3D visualization of patient scans. In 1999, Alcañiz and colleagues introduced semi-automatic segmentation techniques that were ahead of their time. For example, they developed shape-constrained deformable models (“snakes”) for accurate prostate boundary detection in 3D medical images. Around the same time, they proposed a multiresolution segmentation method using morphological filters and 3D watershed algorithms to capture anatomical structures with minimal user input. These early publications demonstrated how combining prior anatomical knowledge with multiscale image analysis could yield precise 3D models of organs from CT/MRI data, prefiguring modern AI-driven segmentation. Notably, this work was accomplished on standard PCs of the era, exemplifying Alcañiz’s talent for making advanced imaging tools accessible with limited computing resources. Such innovations gave surgeons and radiologists new capabilities for visualizing tumours, bones, and vasculature in three dimensions, greatly enhancing pre-operative planning in an era when 2D slices were still the norm.
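
The morphological-filter-plus-watershed idea from this period survives in today’s open-source toolkits. As a hedged illustration (not the original implementation; scikit-image and a synthetic 2D “slice” are our assumptions), marker-controlled watershed over a morphological gradient can be sketched like this:

```python
# Illustrative marker-controlled watershed segmentation, in the spirit of the
# morphological multiresolution approach described above. The synthetic image
# stands in for a CT/MRI slice; all parameters here are our own choices.
import numpy as np
from skimage.filters import rank
from skimage.morphology import disk
from skimage.segmentation import watershed

# Synthetic 2D "slice": a bright circular organ on a darker, noisy background.
yy, xx = np.mgrid[0:128, 0:128]
image = ((xx - 64) ** 2 + (yy - 64) ** 2 < 30 ** 2).astype(np.uint8) * 200
image += np.random.default_rng(0).integers(0, 20, image.shape).astype(np.uint8)

# Morphological gradient highlights organ boundaries (the "ridges" the
# watershed flooding stops at).
gradient = rank.gradient(image, disk(2))

# Seed markers play the role of the user's minimal input.
markers = np.zeros_like(image, dtype=np.int32)
markers[image < 50] = 1    # background seed
markers[image > 150] = 2   # organ seed

labels = watershed(gradient, markers)
print("segmented organ pixels:", int((labels == 2).sum()))
```

In the multiresolution method described above, the markers came from coarse-scale analysis and prior anatomical knowledge rather than simple thresholds, but the flooding step is the same.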

Advancements in Computer-Assisted Surgery and Image-Guided Interventions

Building on his imaging breakthroughs, Prof. Alcañiz moved into computer-assisted surgery (CAS), developing some of Spain’s first image-guided intervention systems. By 1999, he had co-authored an “interactive image-guided surgery” prototype for neurosurgery that harnessed high-performance computing on affordable workstations. This system – created in partnership with neurosurgeon José Barcia-Salorio – allowed surgeons to navigate and visualize a patient’s brain images during stereotactic procedures when surgical navigation was still in its infancy. The prototype anticipated later commercial neuronavigation platforms by demonstrating real-time alignment of preoperative 3D images with the patient’s anatomy during surgery. In the early 2000s, Alcañiz explored augmented reality (AR) to enhance surgical precision. His team designed an AR system for laparoscopic (“keyhole”) surgery that overlaid 3D virtual organs – derived from the patient’s CT/MR scans – onto the live video of the patient to guide optimal trocar placement. In a validation study, this AR guidance system achieved a placement accuracy within ~3 mm – a remarkable achievement in bringing imaging data into the operating theatre in real time. Prof. Alcañiz’s emphasis on rigorous validation (e.g. using phantoms and error measurement) set a scientific standard for AR in surgery. His work in CAS has been translated into practice through continued collaborations with hospitals – for instance, partnering with Valencia’s La Fe Hospital on novel neurosurgical AR and monitoring techniques (Electrodes and virtual reality to decipher the functioning of the brain) – ensuring that these technologies address real clinical needs. By integrating engineering innovations with surgical workflows, Alcañiz significantly advanced image-guided intervention techniques in Europe, helping surgeons see the unseen – whether a deep brain target or an intra-abdominal organ – with greater confidence and accuracy.
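
At the heart of such image-to-patient alignment is rigid registration between preoperative image coordinates and intraoperative (patient or tracker) coordinates, typically computed from a handful of corresponding fiducial points. The following is a minimal sketch of the standard SVD-based (Kabsch) solution; the function name, demo data, and error metric are illustrative assumptions, not taken from Alcañiz’s systems:

```python
# Hedged sketch: rigid point-based registration (Kabsch/SVD) between fiducials
# located in a preoperative scan and the same fiducials on the patient.
import numpy as np

def rigid_register(scan_pts, patient_pts):
    """Find rotation R and translation t minimizing ||R @ scan + t - patient||."""
    cs, cp = scan_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (scan_pts - cs).T @ (patient_pts - cp)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if the determinant is negative.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cp - R @ cs
    return R, t

# Demo: four fiducials, a known rotation about z plus a translation (in mm).
rng = np.random.default_rng(1)
scan = rng.uniform(-50, 50, size=(4, 3))
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -5.0, 2.0])
patient = scan @ R_true.T + t_true

R, t = rigid_register(scan, patient)
fre = np.linalg.norm(scan @ R.T + t - patient, axis=1).mean()  # fiducial error
print(f"mean fiducial registration error: {fre:.2e} mm")
```

With noiseless fiducials the recovered transform is exact; in practice, the residual fiducial registration error is the kind of quantity measured in the phantom validation studies mentioned above.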

Virtual Reality Surgical Simulators and Training Platforms

Prof. Alcañiz is perhaps most internationally recognized for his contributions to virtual reality (VR)–based surgical simulation and training. Anticipating the surgical education revolution, he developed some of the earliest high-fidelity surgical simulators using VR in the late 1990s and early 2000s. Notably, he was a driving force behind “GeRTiSS” (Generic Real-Time Surgery Simulation), a general simulator for minimally invasive surgery that he co-authored and presented in 2003. GeRTiSS was a second-generation VR trainer that allowed surgeons to practice on virtual patient organs (even using a specific patient’s CT data) in a realistic 3D environment. The simulator incorporated haptic interfaces with force feedback, so the trainee could feel resistance and textures while performing virtual laparoscopic maneuvers. This capability – providing a sense of touch in VR – was highly novel at the time and critical for surgical skill acquisition. The virtual scenes in GeRTiSS could be populated with actual patient anatomy (for patient-specific rehearsal) or synthetic organs exhibiting various pathologies. This flexibility anticipated today’s patient-specific surgical rehearsal systems. Prof. Alcañiz’s team demonstrated that such a simulator could closely replicate the look and feel of a real minimally invasive procedure, a milestone in surgical education technology. Indeed, a few years later, his group published a comprehensive survey that became a touchstone for the field, cited over 450 times. This influential 2005 review systematically compared the emerging techniques for simulating soft-tissue behaviour (mass-spring models, finite elements, etc.), highlighting their trade-offs in realism versus speed. By disseminating this knowledge, Alcañiz guided researchers worldwide in addressing the key challenge of realism in surgical simulators.
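
The mass-spring approach that the survey contrasts with finite elements can be conveyed in a few lines. The following toy 1D chain is a hedged sketch under our own parameter choices, not any simulator’s actual model: point masses joined by damped springs, stepped with semi-implicit Euler integration:

```python
# Toy mass-spring "tissue strand": one end anchored, the free end displaced
# by a virtual instrument, then released to relax back. All constants are
# illustrative assumptions.
import numpy as np

n = 5                           # masses along the strand
rest = 1.0                      # spring rest length
k, c, m, dt = 50.0, 2.0, 0.1, 0.001   # stiffness, damping, mass, time step

x = np.arange(n, dtype=float)   # positions along the strand
v = np.zeros(n)                 # velocities

x[-1] += 0.5                    # "instrument" pulls the free end

for _ in range(5000):           # 5 simulated seconds
    f = np.zeros(n)
    # Spring + damper forces between neighbouring masses.
    for i in range(n - 1):
        stretch = (x[i + 1] - x[i]) - rest
        rel_v = v[i + 1] - v[i]
        fs = k * stretch + c * rel_v
        f[i] += fs
        f[i + 1] -= fs
    # Semi-implicit Euler; node 0 stays fixed (anchored tissue).
    v[1:] += dt * f[1:] / m
    x[1:] += dt * v[1:]

# The strand settles back near its rest configuration [0, 1, 2, 3, 4].
print("final positions:", np.round(x, 3))
```

Mass-spring models trade physical fidelity for exactly this kind of cheap per-step update, which is why the survey weighs them against slower but more accurate finite-element formulations.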

Beyond minimally invasive surgery, Prof. Alcañiz also extended VR simulation to other domains – including neurosurgical procedures and orthopaedic training – and incorporated these tools into surgical curricula. His vision always combined technical rigour with practical training value: every simulator was tested with surgeons and refined for educational effectiveness. Through LabLENI (his lab at UPV), Alcañiz has transferred these simulators to training centres, allowing young surgeons to hone skills in a risk-free environment. Many elements his team pioneered – such as patient-specific virtual anatomy, force-feedback instruments, and even multi-user collaborative simulation – have since been adopted by commercial surgical simulators and are now standard in surgical residency programs. In sum, Prof. Alcañiz’s work in VR-based surgical simulation helped transform surgical training from the old master-apprentice model to a modern paradigm where critical skills can be practised and assessed in realistic virtual environments before ever touching a patient.

Translational Impact on Clinical Practice and Training

A hallmark of Prof. Alcañiz’s career is translating technological innovation into real-world clinical practice and training protocols. He has consistently bridged the gap between laboratory research and bedside application. For instance, the 3D imaging and planning tools from his early projects were adopted by surgeons for the pre-operative planning of complex cases. By the early 2000s, neurosurgery teams in Valencia used Alcañiz’s software prototypes to visualize brain tumours and plan safer surgical trajectories in 3D. This work informed the development of neuronavigation systems that became routine in operating rooms a few years later. Likewise, the VR surgical simulators developed under his guidance did not remain academic curiosities – they were tested in hospital settings with surgical residents. Trainees who used the simulators reported improved psychomotor skills and confidence, contributing to the growing evidence (now well-established) that simulation-based training improves surgical performance. After phantom validations, Prof. Alcañiz’s 2011 AR laparoscopic system was trialled by experienced surgeons to assist with trocar placement; such trials showed the potential to reduce errors in port positioning. Insights from these studies have fed into subsequent AR navigation systems used in operating theatres (for example, modern laparoscopic guided systems and liver surgery planners cite foundational work like Alcañiz’s in aligning virtual organ models with the patient in situ).

Some Related Papers

Alcañiz, M., Montserrat, C., Grau, V., Chinesta, F., Ramón, A., & Albalat, S. (1998). An advanced system for the simulation and planning of orthodontic treatment. Medical Image Analysis, 2(1), 61-77.

Juan, M. C. (1999). Outlining of the prostate using snakes with shape restrictions based on the wavelet transform. Pattern Recognition, 32, 1767-1781.

Roldan, P., Barcia-Salorio, J. L., Talamantes, F., Alcañiz, M., Grau, V., Monserrat, C., & Juan, C. (2000). Interactive image-guided surgery system with high-performance computing capabilities on low-cost workstations: A prototype. Stereotactic and Functional Neurosurgery, 72(2-4), 112-116.

Monserrat, C., Meier, U., Juan, M. C., Alcañiz, M., Knoll, C., Grau, V., … & Duval, C. (1999). A new approach for the real-time simulation of tissue deformations. Les Cahiers de Rhéologie, 16(3).

Benlloch, J. M., Alcañiz, M., Escat, B., Fernandez, M. M., Gimenez, M., Gomez, R., … & Vera, D. (2004). The gamma functional navigator. IEEE Transactions on Nuclear Science, 51(3), 682-689.

Grau, V., Raya, M. A., Monserrat, C., Juan, M. C., & Martí-Bonmatí, L. (2004). Hierarchical image segmentation using a correspondence with a tree model. Pattern Recognition, 37(1), 47-59.

Grau, V., Mewes, A. U. J., Alcañiz, M., Kikinis, R., & Warfield, S. K. (2004). Improved watershed transform for medical image segmentation using prior information. IEEE Transactions on Medical Imaging, 23(4), 447-458.

Pithioux, M., López, O., Meier, U., Monserrat, C., Juan, M. C., & Alcañiz, M. (2005). ParSys: a new particle system for the introduction of on-line physical behaviour to three-dimensional synthetic objects. Computers & Graphics, 29(1), 135-144.

Meier, U., López, O., Monserrat, C., Juan, M. C., & Alcañiz, M. (2005). Real-time deformable models for surgery simulation: a survey. Computer Methods and Programs in Biomedicine, 77(3), 183-197.

Baviera, J. B., Raya, M. A., Martinez, F., & Colomer, V. (2007). Functional navigator. US Patent App. 11/096,226.

Naranjo, V., Lloréns, R., Alcañiz, M., & López-Mir, F. (2011). Metal artifact reduction in dental CT images using polar mathematical morphology. Computer Methods and Programs in Biomedicine, 102(1), 64-74.

Lloréns, R., Naranjo, V., López, F., & Alcañiz, M. (2012). Jaw tissues segmentation in dental 3D CT images using fuzzy-connectedness and morphological processing. Computer Methods and Programs in Biomedicine, 108(2), 832-843.

Monserrat, C., Rupérez, M. J., Alcañiz, M., & Mataix, J. (2014). Markerless monocular tracking system for guided external eye surgery. Computerized Medical Imaging and Graphics, 38(8), 785-792.

Morales, S., Naranjo, V., & Alcañiz, M. (2015). Automatic detection of retinal structures based on mathematical morphology. In Frontiers of Medical Imaging (pp. 211-232).

Fuentes-Hurtado, F., Diego-Mas, J. A., Naranjo, V., & Alcañiz, M. (2019). Automatic classification of human facial features based on their appearance. PLoS ONE, 14(1), e0211314.

Alcañiz, M., Chinesta, F., Albalat, S., Grau, V., & Monserrat, C. (2020). A new system for 3D planning of orthodontic treatment and 3D tooth movement simulation. In Computer Methods in Biomechanics and Biomedical Engineering 2 (pp. 645-654). CRC Press.

Alcañiz, P., Pérez, J., Gutiérrez, A., Barreiro, H., Villalobos, Á., Miraut, D., … & Otaduy, M. A. (2021). Soft-tissue simulation for computational planning of orthognathic surgery. Journal of Personalized Medicine, 11(10), 982.

Grau, V., Alcañiz, M., Juan, M. C., Monserrat, C., & Knoll, C. (2001). Automatic localization of cephalometric landmarks. Journal of Biomedical Informatics, 34(3), 146-156.