Recent advances at the intersection of virtual reality and neuroscience have catalysed a paradigm shift in how human cognition and behaviour are assessed and enhanced. Professor Mariano Alcañiz has been at the forefront of this shift, pioneering immersive neurotechnologies as a distinct field of research. Immersive neurotechnology merges extended reality (XR) with cognitive neuroscience and artificial intelligence to study the mind in realistic simulated environments. Alcañiz’s role as founding director of the Immersive Neurotechnologies Lab underscores the emergence of this new paradigm: his interdisciplinary team combines computer science, psychology, and neuroscience to explore human cognition in digital worlds. By leveraging XR not just as a tool but as an experimental medium, Alcañiz’s work departs from conventional methodologies and establishes immersive XR platforms as credible scientific instruments for neuroscience research. This approach has been described as “an open door to unexplored terrain” in brain science, highlighting how fundamentally it differs from prior paradigms.
Paradigm Shift to Immersive Neurotechnologies
Alcañiz’s contributions have helped define immersive neurotechnologies as a novel research domain. Traditional cognitive assessment and training relied on artificial tasks and explicit self-report measures, which often suffer from poor ecological validity and biases (e.g. social desirability). In contrast, immersive neurotechnology situates assessment within lifelike XR scenarios, enabling a more naturalistic evaluation of cognitive functions. By recreating complex real-life situations in virtual environments, researchers can evoke genuine cognitive and emotional responses under controlled laboratory conditions. This marks a paradigm shift from 20th-century methodologies toward experiential, context-rich experimentation. Alcañiz’s work exemplifies this shift: he integrates high-fidelity XR simulations with neuroscience techniques to capture implicit behavioural and psychophysiological data as people interact in realistic virtual settings. Such integration represents a new kind of ecologically grounded neuroscience, wherein immersive XR becomes both a stimulus and measurement device. The significance of this shift is reflected in the conceptualisation of XR-based Behavioral Biomarkers (XRBB) – quantifiable indicators of cognitive or affective processes extracted from behaviour and physiological signals in XR environments. The introduction of XRBB underlines the emergence of immersive neurotechnologies as a distinct field: it extends the biomarker concept from the biochemical realm into behavioural neuroscience conducted within XR. Alcañiz and colleagues argue that using XR with implicit neural measures allows the classification of human cognitive traits in ways not previously feasible. In summary, the paradigm shift championed by Alcañiz is characterised by moving from abstract, questionnaire-based assessments to immersive, data-rich XR paradigms – effectively blurring the line between experiment and experience to understand the human mind.
Integrating Extended Reality, Neuroscience, and AI
A cornerstone of Alcañiz’s research is the seamless integration of XR technology with neuroscience and artificial intelligence algorithms. In his framework, extended reality (encompassing virtual and augmented reality) provides the immersive context to engage users in tasks that tap into memory, attention, emotion, and other cognitive processes. Meanwhile, sensors and neurophysiological measures (e.g. eye tracking, galvanic skin response, EEG) record the user’s implicit responses – the physiological and behavioural signatures of their cognitive and emotional state during the XR experience. Alcañiz’s team can identify patterns corresponding to specific psychological states or traits by applying AI and machine learning techniques to these rich data streams. In other words, AI algorithms distil complex behavioural and biosignal data into XR-based behavioural biomarkers that characterise an individual’s cognitive or affective profile. This fundamentally multidisciplinary approach relies on computational methods to fuse psychology and neuroscience within XR. For example, a virtual social scenario might elicit stress or decision-making behaviours; AI models then analyse heart rate variability, gaze patterns, voice tone, and choices made by the participant to infer underlying traits such as anxiety reactivity or social cognition style. The outcome is a set of objective biomarkers reflecting the person’s cognitive-emotional processing in an ecologically valid setting. By integrating these elements, Alcañiz has created a computational neuropsychology paradigm: XR provides controlled yet lifelike stimuli, neuroscience offers measures for brain-body responses, and AI provides the analytical power to map those responses onto latent psychological dimensions. This fusion of XR, neuroscience, and AI is at the heart of immersive neurotechnologies, and it enables advancements like XRBB that were not achievable within any single discipline alone.
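To make the pipeline concrete, the sketch below shows one plausible way the “AI distils biosignals into a biomarker” step could look. This is an illustrative assumption, not Alcañiz’s actual pipeline: the feature set (heart-rate variability, gaze-fixation entropy, skin-conductance peak rate, decision latency), the synthetic data, and the use of a simple logistic-regression classifier are all invented for the example.

```python
# Hypothetical sketch: distilling per-session multimodal XR features into a
# single behavioural-biomarker score with a standard classifier.
# Data and feature names are synthetic, for illustration only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Each row summarises one XR session, e.g.:
# [mean heart-rate variability, gaze-fixation entropy,
#  skin-conductance peak rate, mean decision latency]
n = 200
X = rng.normal(size=(n, 4))

# Synthetic ground truth: pretend "high stress reactivity" correlates with
# low HRV (feature 0) and a high skin-conductance peak rate (feature 2).
y = ((-X[:, 0] + X[:, 2] + rng.normal(scale=0.5, size=n)) > 0).astype(int)

# Standardise features, then fit a linear classifier.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# The fitted class probability acts as a crude "stress reactivity"
# biomarker score for a new session's feature vector.
new_session = np.array([[-1.2, 0.1, 1.5, 0.3]])
score = model.predict_proba(new_session)[0, 1]
print(f"stress-reactivity score: {score:.2f}")
```

In practice such models would be trained on labelled sessions and validated across participants; the point here is only the shape of the mapping from raw signal summaries to an interpretable biomarker score.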
Limitations of Prior Approaches
Alcañiz’s innovations arose in response to clear limitations in pre-existing approaches to training and intervention. Before the advent of immersive neurotechnology, two common strategies in applied psychology and rehabilitation were virtual environment interventions without personalisation and generic biofeedback techniques. We examine each in turn and highlight their shortcomings.
Generic Virtual Environments. Early virtual reality interventions often employed standardised or one-size-fits-all virtual environments for all users. Whether used for phobia exposure therapy, cognitive training, or skills practice, these virtual scenarios were typically static – every participant experienced the same content regardless of their individual needs. This generic approach neglected the vast heterogeneity in users’ cognitive capacities, symptoms, or learning styles. Consequently, its efficacy was limited: a scenario that might be appropriate for one person could be ineffective or even counter-productive for another. Recent reviews of VR interventions underscore this limitation. For instance, in the context of autism spectrum disorder (ASD) therapy, researchers note the heterogeneity of symptoms among patients and emphasise “the need for an early customised intervention that targets specific symptom configurations for each individual”. Generic virtual training that does not adapt to individual differences fails to engage users or fully transfer to real-world improvements. More broadly, traditional VR systems lacked mechanisms to adjust task difficulty or context based on a user’s moment-to-moment performance or emotional state. The result was often a mismatch between the virtual challenge and the user’s capabilities, leading to suboptimal training efficiency. This shortfall motivated the development of adaptive virtual environments, wherein the system can personalise scenarios on the fly – a capability notably absent in the first generation of VR applications.
Biofeedback Without Cognitive-Emotional Personalisation. In parallel with VR interventions, biofeedback and neurofeedback techniques have long been used to help individuals regulate their physiology (e.g. heart rate, muscle tension, brain waves) for therapeutic benefit. However, traditional biofeedback systems typically provide generic feedback (such as a visual or auditory signal indicating stress level) and expect the user to self-regulate without tailoring the process to the user’s unique cognitive appraisals or emotional triggers. These systems treat physiological signals in isolation, decoupled from the person’s psychological context. A significant limitation of this approach is the lack of personalisation: two individuals with identical heart rate readings may have very different emotional experiences or cognitive interpretations, yet a generic biofeedback display would treat them equivalently. The literature identifies “patient response variability” and lack of standardised personalisation as key challenges that hinder the effectiveness of classical digital therapeutics. In essence, earlier biofeedback paradigms did not leverage cognitive neuroscience knowledge. For example, they did not use anxiety, attention, or emotion regulation models to inform what the physiological signals meant for each user. As a result, the feedback often remained superficial (e.g. “lower your heart rate”) and did not directly address the person’s internal cognitive-emotional state. This limitation reduces user engagement and the transfer of learned regulation skills to real-life contexts. It has become evident that overcoming these shortcomings “requires a paradigm shift” toward interventions incorporating emotional-cognitive support and adapting to the individual’s mental state in real time.
Alcañiz’s research agenda explicitly tackles this need for personalisation, moving beyond simplistic biofeedback by using neuroscience-guided biomarkers to capture how each user is processing the virtual experience at a psychological level.
Cognitive Neuroscience Models for Personalized Biomarkers
Professor Alcañiz’s approach directly addresses the above limitations by grounding the design of virtual environments and the interpretation of user data in cognitive neuroscience models. Rather than using ad-hoc or purely symptom-driven adaptation, his methodology draws on theoretical models of cognitive processes (like attention, memory, and executive function) and emotional processes. These models inform the creation of XR scenarios intended to probe specific mental functions. For example, a virtual environment may be structured around a complex social situation to concurrently engage social cognition and executive decision-making. During these scenarios, Alcañiz’s team measures many behavioural and physiological responses – from decision reaction times and gaze patterns to galvanic skin response (GSR) and voice tone. The critical innovation is that they interpret these measures through the lens of cognitive neuroscience: patterns in the data are mapped to latent cognitive or emotional traits. This is essentially an assessment of psychological dimensions using XR, enabled by biomarkers. In Alcañiz’s framework, the XR-based behavioural biomarkers (XRBB) extracted encapsulate individual differences in brain function during the task. Notably, these biomarkers are implicit – derived from unconscious or involuntary responses (e.g. physiological arousal, micro-behaviors) rather than what the person explicitly reports or consciously controls. As such, XRBB can reveal traits like impulsivity, stress reactivity, attentional capacity, or emotional resilience in a way that traditional explicit tests or generic biofeedback cannot. This approach leverages established cognitive neuroscience mappings (for instance, linking elevated heart rate variability with better emotion regulation or specific gaze patterns with attentional focus) to give meaning to the raw data collected. It effectively transforms low-level signals into high-level psychological insights. 
Alcañiz and colleagues have described this process as using “a neuroscientific paradigm based on implicit brain processes measured through psychophysiological signals and behaviour of subjects exposed to complex social conditions’ replication using virtual reality interfaces”. In simpler terms, the virtual scenario is designed according to a cognitive model, and the user’s reactions are decoded into biomarkers that reflect their cognitive-emotional state. By extracting these personalised biomarkers, the system understands the user’s unique cognitive and emotional profile – which can then guide the tailoring of the virtual environment itself.
Personalised Virtual Environments for Clinical and Performance Applications
A key outcome of Alcañiz’s XRBB approach is the ability to personalise virtual environments in previously unattainable ways. Armed with biomarkers that quantify an individual’s cognitive strengths, weaknesses, and emotional tendencies, the system can adjust the content and parameters of the XR experience to suit that individual. This personalisation is vital in both clinical and high-performance contexts. In clinical therapy or rehabilitation, for example, an XR system can detect a patient becoming overwhelmed (through stress biomarkers), dynamically reduce the intensity of the virtual stimuli, or recognise insufficient engagement and introduce new challenges. The environment can also target specific cognitive deficits: if biomarkers indicate a patient has difficulty with impulse control, the VR exercises can be configured to practice that skill more intensively. Such tailoring ensures that interventions address each patient’s particular needs rather than using generic exercises. This approach aligns with the trend toward personalised medicine, bringing it into the realm of mental health and neurorehabilitation. Indeed, recent research emphasises that adaptive systems “capable of dynamically customising interventions to individual physiological, cognitive, and emotional conditions” represent the next generation of non-pharmacological therapy. Alcañiz’s work operationalises this vision by connecting real-time biomarker analysis to therapeutic content adjustments.
In performance and training domains, the same principle applies. Consider the example of leadership training in organisational settings: Traditional leadership workshops might present all trainees with identical scenarios, but Alcañiz’s XR approach can personalise scenarios based on each trainee’s cognitive-emotional profile. His research has demonstrated that XR scenarios offer “significant assessment playgrounds” for complex skill sets like leadership, providing interactive, multisensory situations that require the application of various skills simultaneously. Within such a scenario, XRBB can assess attributes like decision-making style, stress management, and social interaction tendencies for a given individual. Using this information, the system can personalise the training—for instance, presenting tougher ethical dilemmas to a trainee who shows high-stress tolerance or providing more supportive feedback to a trainee whose biomarkers suggest anxiety. The result is a training environment tuned to optimise each individual’s growth. By leveraging biomarkers as a personalisation layer, the VR training becomes adaptively difficulty-scaled and context-specific, enhancing learning outcomes. Early evidence supports the efficacy of this approach; in VR-based cognitive training studies, systems that adapt task parameters based on user performance and psychophysiology have shown improved user engagement and skill transfer. Thus, whether in clinical rehabilitation or professional development, Alcañiz’s XRBB-driven personalisation enables virtual experiences to be user-centred, responding to the person’s needs rather than expecting the person to conform to a fixed program.
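The leadership-training example above can be sketched as a simple personalisation rule that maps a trainee’s biomarker profile to scenario parameters. Everything here is hypothetical: the profile fields, the thresholds, and the parameter names are invented to illustrate the idea, not taken from any published system.

```python
# Hypothetical illustration of biomarker-driven scenario personalisation:
# mapping a trainee's profile to XR training parameters.
# Field names and thresholds are invented for this sketch.
from dataclasses import dataclass

@dataclass
class BiomarkerProfile:
    stress_tolerance: float  # 0..1, e.g. derived from HRV under load
    anxiety: float           # 0..1, e.g. from skin conductance and gaze avoidance

def personalise_scenario(profile: BiomarkerProfile) -> dict:
    """Choose leadership-training scenario parameters from a biomarker profile."""
    params = {"dilemma_difficulty": "standard", "feedback_style": "neutral"}
    if profile.stress_tolerance > 0.7:
        # High stress tolerance: present tougher ethical dilemmas.
        params["dilemma_difficulty"] = "hard"
    if profile.anxiety > 0.6:
        # Elevated anxiety markers: switch to more supportive feedback.
        params["feedback_style"] = "supportive"
    return params

print(personalise_scenario(BiomarkerProfile(stress_tolerance=0.8, anxiety=0.3)))
```

A real system would replace the hard-coded thresholds with learned policies, but the design choice is the same: biomarkers act as the personalisation layer between assessment and content.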
Closed-Loop Adaptive Systems
Central to Alcañiz’s innovation is implementing a closed-loop system that tightly couples real-time biomarker analysis with on-the-fly adjustments of the virtual environment. In this closed-loop architecture, the process can be summarised as follows: the user engages with the XR scenario and generates behavioural/physiological data; these data are continuously analysed (using AI algorithms and cognitive models) to update the user’s biomarker profile in real time; based on the latest biomarker readings, the system modifies the virtual environment’s parameters or narrative; the user then reacts to the updated environment, generating new data – and the cycle repeats. This dynamic feedback loop ensures that the intervention or training remains finely attuned to the user’s current state at every moment. The significance of such closed-loop adaptation cannot be overstated. It effectively creates an interactive dialogue between human and virtual systems, wherein the system “responds in real-time to [user] behaviours with accuracy beyond human observation”. Unlike an open-loop system (where the environment is pre-set and unchanging during the session), the closed-loop approach can prevent situations of excessive difficulty (which might frustrate or stress the user) or excessive ease (which fails to stimulate growth). For example, if a user’s cognitive workload biomarker indicates fatigue, the system might decide to pause or simplify the next task; if, instead, the biomarkers show the user is fully engaged and performing well, the system might introduce a more complex challenge to maintain an optimal difficulty level. In therapeutic contexts, this means the therapy is constantly personalised: the system can detect a spike in anxiety and immediately trigger a calming intervention or guide the user through a coping strategy within the virtual environment. This real-time responsiveness is supported by emerging research in adaptive VR. 
A recent systematic review of VR adaptive systems concluded that real-time, multimodal feedback driven by machine learning is a promising technique to enhance interventions, demonstrating measurable benefits in user outcomes. Moreover, adaptive algorithms can integrate multiple biosignals (heart rate, skin conductance, eye gaze, etc.) to infer complex mental states and adjust the environment more sensitively than any human therapist or trainer could. The closed-loop system essentially operationalises the vision of an AI co-facilitator that continuously tailors the experience – an idea supported by the broader trend of AI-driven biofeedback in healthcare, which has shown that “real-time physiological assessment and individualised adjustments…enhances treatment…while maximising long-term efficacy”. Alcañiz’s work provides a concrete instantiation of this concept within XR: his closed-loop immersive systems represent a new class of neurotechnology that marries sustained user monitoring with immediate environment modulation. This level of adaptivity is a defining feature of the immersive neurotechnologies paradigm and enables XR platforms to serve as both assessment and intervention tools simultaneously. By dynamically closing the loop, the system ensures that the virtual environment is not merely a passive stimulus but an active, evolving partner in the user’s cognitive journey.
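The closed-loop cycle described above — sense, analyse, adapt, repeat — can be reduced to a minimal control loop. The sketch below is an assumed simplification, not the authors’ implementation: the biomarker reader is a stand-in for real-time signal analysis, and the adaptation rule just nudges task difficulty toward an optimal-challenge workload band.

```python
# Minimal sketch of a closed-loop adaptive XR session:
# sense biomarkers -> analyse -> adjust the environment -> repeat.
# The workload estimate and adaptation rule are stand-ins for illustration.
import random

random.seed(7)  # deterministic demo

def read_biomarkers():
    """Stand-in for real-time multimodal signal analysis.
    Returns a cognitive-workload estimate in [0, 1]."""
    return random.random()

def adapt_environment(difficulty, workload, target=0.6, step=0.1):
    """Nudge task difficulty toward a target workload level."""
    if workload > target + 0.1:    # user overloaded -> ease off
        difficulty = max(0.0, difficulty - step)
    elif workload < target - 0.1:  # user under-challenged -> push harder
        difficulty = min(1.0, difficulty + step)
    return difficulty

difficulty = 0.5
trace = []
for _ in range(10):                # ten iterations of the loop
    workload = read_biomarkers()   # 1) continuous biomarker analysis
    difficulty = adapt_environment(difficulty, workload)  # 2) modify the VE
    trace.append(round(difficulty, 2))  # 3) user reacts; cycle repeats

print(trace)
```

The contrast with an open-loop system is visible in the structure itself: difficulty is recomputed from fresh biomarker readings on every iteration rather than fixed before the session begins.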
Conclusion
Professor Mariano Alcañiz’s research epitomises a transformative shift toward immersive neurotechnologies, establishing this area as a distinct and influential field. By integrating extended reality, neuroscience, and artificial intelligence, his work has introduced XR-based behavioural biomarkers as a novel means to assess and personalise human cognitive-affective functioning. This approach overcomes the limitations of earlier one-size-fits-all virtual interventions and simplistic biofeedback methods by deeply personalising the user experience through biomarkers grounded in cognitive neuroscience models. The result is an intelligent, closed-loop XR system that continuously adapts to the individual – a paradigm of personalised neuro-immersion. Early studies and reviews provide converging evidence that such adaptive XR systems can significantly improve both clinical outcomes and performance training efficacy. Alcañiz’s work thus holds broad significance: it advances the theoretical understanding of human cognition in ecologically valid settings and paves the way for next-generation therapeutic and training platforms tailored in real time to each user. In an academic context, this represents a compelling convergence of technology and neuroscience that aligns with calls for more individualised, dynamically adjusted mental health and education interventions. As immersive neurotechnologies continue to mature, the paradigm established by Alcañiz will likely serve as a foundation for future research and development, illustrating how immersive XR environments can be harnessed as sophisticated tools for measuring the mind and enhancing human potential.
Related Papers
Cipresso, P., Giglioli, I. A. C., Raya, M. A., & Riva, G. (2018). The past, present, and future of virtual and augmented reality research: a network and cluster analysis of the literature. Frontiers in Psychology, 9, 2086.
Alcañiz, M., Chicchi-Giglioli, I. A., Carrasco-Ribelles, L. A., Marín-Morales, J., Minissi, M. E., Teruel-García, G., … & Abad, L. (2022). Eye gaze as a biomarker in the recognition of autism spectrum disorder using virtual reality and machine learning: A proof of concept for diagnosis. Autism Research, 15(1), 131-145.
Alcañiz Raya, M., Chicchi Giglioli, I. A., Marín-Morales, J., Higuera-Trujillo, J. L., Olmos, E., Minissi, M. E., … & Abad, L. (2020). Application of supervised machine learning for behavioral biomarkers of autism spectrum disorder based on electrodermal activity and virtual reality. Frontiers in Human Neuroscience, 14, 90.
Minissi, M. E., Altozano, A., Marín-Morales, J., Giglioli, I. A. C., Mantovani, F., & Alcañiz, M. (2024). Biosignal comparison for autism assessment using machine learning models and virtual reality. Computers in Biology and Medicine, 171, 108194.
Maddalon, L., Minissi, M. E., Parsons, T., Hervas, A., & Alcañiz, M. (2024). Exploring adaptive virtual reality systems used in interventions for children with autism spectrum disorder: systematic review. Journal of Medical Internet Research, 26, e57093.
Alcañiz, M., Parra, E., & Chicchi Giglioli, I. A. (2018). Virtual reality as an emerging methodology for leadership assessment and training. Frontiers in Psychology, 9, 1658.
Parra Vargas, E., Philip, J., Carrasco-Ribelles, L. A., Alice Chicchi Giglioli, I., Valenza, G., Marín-Morales, J., & Alcañiz Raya, M. (2023). The neurophysiological basis of leadership: a machine learning approach. Management Decision, 61(6), 1465-1484.
Parra, E., García Delgado, A., Carrasco-Ribelles, L. A., Chicchi Giglioli, I. A., Marín-Morales, J., Giglio, C., & Alcañiz Raya, M. (2022). Combining virtual reality and machine learning for leadership styles recognition. Frontiers in Psychology, 13, 864266.
Alcañiz, M., Bigné, E., & Guixeres, J. (2019). Virtual reality in marketing: a framework, review, and research agenda. Frontiers in Psychology, 10, 1530.
Blasco-Arcas, L., Lee, H. H. M., Kastanakis, M. N., Alcañiz, M., & Reyes-Menendez, A. (2022). The role of consumer data in marketing: A research agenda. Journal of Business Research, 146, 436-452.