NEURODEVELOPMENTAL DISORDERS ACHIEVEMENTS

Virtual Reality (VR) technology has rapidly gained traction as a tool for both assessment and intervention in neurodevelopmental disorders, especially in autism spectrum disorder (ASD). In recent years, numerous VR applications have been developed to address the needs of individuals with ASD, making VR “one of the most suitable tools to address the psychological needs” of this population. Immersive VR environments offer safe, controlled, and customisable settings where individuals can practice social and communication skills, undergo therapeutic exercises, or be evaluated in scenarios resembling real life. Traditional ASD interventions using VR have mainly focused on skills training – for example, improving social cognition or reducing phobias through exposure therapy – and have shown promising outcomes in enhancing social interaction and reducing anxiety in children with autism. VR has also been explored as an assessment aid, providing semi-structured tasks in virtual settings that mirror situations used in conventional evaluations. Overall, the convergence of VR and autism research has opened new avenues for creating engaging experiences that can both measure and improve core deficits in ASD in ways not possible with standard methods.

Prof. Mariano Alcañiz and his research team have been at the forefront of integrating VR with neuroscience and artificial intelligence to characterize neurodevelopmental conditions better. As director of LabLENI (Laboratory for Immersive Neurotechnologies), he has spearheaded multidisciplinary projects leveraging VR for diagnostic assessment and therapeutic intervention in ASD. One of his notable efforts is the T-ROOM project (2017–2019), which created a virtual immersive environment for assessing and training children with ASD. In T-ROOM, children interact with virtual scenarios enriched with multisensory stimuli (visual, auditory, and even olfactory cues) to simulate real-world social situations in a controlled setting. Through such projects, Prof. Alcañiz’s team demonstrated how VR can serve as a platform to engage autistic children and record their responses in ways that traditional clinics cannot.

A hallmark of Prof. Alcañiz’s work is the integration of biometric and behavioural sensing into VR-based tasks to obtain objective measures of a child’s reactions. For instance, in a study published in 2020, his group used a VR environment (a virtual forest and city) equipped with olfactory, visual, and auditory stimuli to examine sensory processing differences in children with ASD. Children wore physiological sensors – such as electrodermal activity (EDA) wristbands – while experiencing these virtual scenes. The recorded responses were analyzed with machine learning to see if autistic children could be distinguished from typically developing children based on their implicit physiological reactions. The results were striking: patterns of EDA changes in response to combined sensory stimuli in the VR setting allowed the team to classify ASD vs. typical development with up to 90% accuracy. According to the authors, these findings indicate that implicit measures (like autonomic arousal captured via EDA) in ecologically valid VR settings can serve as reliable quantitative indicators of ASD, complementing or enhancing traditional clinical assessments. This approach contrasts with conventional diagnostic methods (e.g., observational interviews and rating scales) by providing objective data on how a child’s nervous system reacts to realistic social and sensory challenges.
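The pipeline described above – summary features extracted from EDA traces, fed to a supervised classifier and evaluated by cross-validation – can be sketched as follows. This is an illustrative sketch only, using synthetic data and simple hypothetical features, not the authors' actual feature set or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

def eda_features(trace):
    # Simple summaries of one EDA trace (tonic level, variability,
    # phasic-like peaks); placeholders for the richer features a real study uses.
    d = np.diff(trace)
    return [trace.mean(), trace.std(), d.max(), float((d > 0.05).sum())]

# Synthetic stand-in data: 20 "typically developing" and 20 "ASD" traces,
# with the second group simulated as more reactive to sensory stimuli.
traces, labels = [], []
for label in (0, 1):
    for _ in range(20):
        noise = rng.normal(0, 0.02 + 0.02 * label, 300)
        bursts = rng.random(300) < (0.01 + 0.02 * label)   # event-related peaks
        traces.append(np.cumsum(noise) + 0.3 * np.convolve(bursts, np.ones(10), "same"))
        labels.append(label)

X = np.array([eda_features(t) for t in traces])
y = np.array(labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With real recordings, the feature extraction step would typically separate tonic and phasic EDA components and align phasic responses to the onsets of the virtual stimuli.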

Beyond sensory responses, Prof. Alcañiz’s team has also investigated motor and attentional behaviours in VR as potential autism markers. In recent work, they tracked children’s full-body movements while the children performed goal-directed actions in a virtual playground. Subtle motor abnormalities – such as atypical coordination or use of body parts during certain tasks – were identified in the ASD group that were not evident in typical peers. These differences, measurable through motion capture in VR, deepen the objective assessment of ASD motor skills and provide insight into visuomotor coordination deficits that may not be obvious through standard observation. Similarly, the team has experimented with eye-tracking in VR: by having children engage with virtual characters and measuring gaze patterns, they have shown that eye gaze data combined with machine learning can distinguish ASD children with high accuracy (on the order of ~80–90% in preliminary studies). Across these contributions, Prof. Alcañiz has consistently pushed the idea that VR is more than just a therapeutic medium – it is a rich data collection environment where a child’s moment-to-moment behaviours (physiological responses, gaze, gestures, decisions) can be objectively recorded and analyzed as indicators of their neurocognitive profile.

Another significant contribution is Prof. Alcañiz’s development of adaptive VR interventions guided by biomarkers. Recognizing that each autistic individual is unique, his lab has worked on systems that assess behaviour in VR and adjust the virtual environment in real time to the individual’s needs. For example, the ongoing ADAPTEA project aims to create an adaptive VR therapy for ASD in which the difficulty or stimuli are modulated based on the child’s biometric responses. In a presentation at an international congress on neurodevelopment, Prof. Alcañiz described “adaptive interventions for ASD using biomarker-guided virtual reality stimulation,” highlighting how VR exercises could dynamically change (e.g., introduce a calming scene if stress biomarkers rise) to personalize therapy. This line of work effectively closes the loop between assessment and intervention: biomarkers extracted from the child’s behaviour in VR inform the therapeutic content, making the intervention more responsive and potentially more effective for that child. Such contributions place Prof. Alcañiz among the pioneers who use VR not only to observe neurodevelopmental differences but also to act on them in a tailored way, moving towards precision digital therapeutics for autism.
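The biomarker-guided adaptation loop can be sketched as a simple state update rule. This is a hypothetical illustration – the state fields, thresholds, and signal names are assumptions for exposition, not details of the ADAPTEA system.

```python
from dataclasses import dataclass

@dataclass
class SceneState:
    difficulty: int = 2        # 1 (easiest) .. 5 (hardest); assumed scale
    calming_scene: bool = False

def adapt_scene(state: SceneState, eda_arousal: float, engagement: float) -> SceneState:
    """One step of the assess-then-adapt loop.

    eda_arousal: normalized stress biomarker (0..1) derived from the EDA stream.
    engagement:  normalized task-engagement estimate (0..1).
    """
    if eda_arousal > 0.8:                       # overstimulated: de-escalate
        return SceneState(difficulty=max(1, state.difficulty - 1),
                          calming_scene=True)
    if engagement < 0.3:                        # under-engaged: raise challenge
        return SceneState(difficulty=min(5, state.difficulty + 1))
    return SceneState(difficulty=state.difficulty)  # within band: hold steady

# Simulated sequence of biomarker readings across three update cycles.
state = SceneState()
for arousal, engagement in [(0.5, 0.9), (0.9, 0.7), (0.4, 0.2)]:
    state = adapt_scene(state, arousal, engagement)
    print(state)
```

The design point is that the same data stream serves two purposes at once: each reading is stored as assessment data and immediately consumed by the controller that shapes the next moment of therapy.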

XR-Based Behavioral Biomarkers (XRBB)

A cornerstone of Prof. Alcañiz’s recent work is the introduction of the XR-based Behavioral Biomarkers (XRBB) concept. XRBB refers to objectively measurable behavioural or physiological characteristics, obtained within an eXtended Reality (XR) environment, that indicate an individual’s cognitive or neurological condition. In simpler terms, an XRBB is a biomarker of behaviour elicited and captured through immersive technologies like VR (and augmented reality), which can be used to identify or monitor neurodevelopmental disorders. Prof. Alcañiz formally introduced this concept around 2020 as “behavioural biomarkers based on VR” (initially termed VRBB) in the context of autism research. The motivation behind XRBB is to overcome the limitations of traditional assessments by using XR to present real-life scenarios under controlled laboratory conditions, thereby eliciting natural behaviours that can be quantitatively measured.

In an XRBB paradigm, a person (for example, a child being evaluated for ASD) is immersed in a lifelike virtual situation – such as a playground interaction or a classroom scene – while various sensors and tracking tools record their responses. These responses can include physiological signals (heart rate, skin conductance, brain activity), behavioural signals (eye movements, facial expressions, body posture), and performance on tasks (reaction times, decisions made in the virtual scenario). The key is that the XR environment provides standardized yet realistic stimuli – replicas of complex social conditions – and the data collected are implicit: the individual is not explicitly answering questions and is often unaware of what is being measured, which reduces biases like social desirability or examiner influence. Prof. Alcañiz argues that applying advanced computational analysis (such as machine learning) to these rich data streams can derive biomarkers that objectively characterize cognitive or behavioural profiles in ways that traditional tests cannot. In the case of autism, an XRBB might be a distinctive pattern of gaze avoidance in a virtual social greeting scenario, or an atypical spike in stress (EDA) when the virtual environment changes unexpectedly – each of which could serve as a digital signature of the disorder’s presence or severity.
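The multiple streams described above are typically reduced to one fixed-length description of a session before any modelling. A minimal sketch of that fusion step follows; the stream names, summary statistics, and shapes are assumptions chosen for illustration.

```python
import numpy as np

def session_features(eda, gaze_xy, head_speed, reaction_times):
    """Concatenate simple summaries of each implicit-measurement stream
    into one per-session feature vector (hypothetical feature choices)."""
    gaze_dispersion = np.std(gaze_xy, axis=0).mean()   # spread of gaze points
    return np.array([
        eda.mean(), eda.std(),                 # physiological arousal
        gaze_dispersion,                       # gaze behaviour
        head_speed.mean(), head_speed.max(),   # body movement
        np.mean(reaction_times),               # task performance
    ])

# Random stand-in streams, as a real session recording is not available here.
rng = np.random.default_rng(7)
vec = session_features(
    eda=rng.random(500),                    # skin-conductance samples
    gaze_xy=rng.random((500, 2)),           # normalized gaze coordinates
    head_speed=rng.random(500),             # head-tracking velocity
    reaction_times=rng.uniform(0.3, 1.2, 20),
)
print(vec.shape)
```

A fixed-length vector per session is what makes the downstream machine learning step straightforward: sessions of different lengths and children with different behaviours all map into the same feature space.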

The concept of XRBB is significant because it represents a fusion of extended reality with neuroscience and AI to create a new class of assessment tools. Unlike paper-and-pencil tests or one-time observations, XRBB can continuously monitor behaviour in scenarios that feel real to the participant but are fully controllable by the researcher/clinician. This opens the door to more sensitive and granular measurements of neurodevelopmental differences. In Prof. Alcañiz’s words, XRBB allows a “computational psychiatry paradigm based on implicit brain processes measured through psychophysiological signals and behaviour of subjects exposed to complex replicas of social conditions using virtual reality interfaces”. Essentially, XRBB operationalizes the vision of computational psychiatry for ASD. Instead of relying solely on subjective impressions, it uses data-driven biomarkers gathered in immersive simulations to classify and understand disorders. The introduction of XRBB has been a novel conceptual leap, framing VR/AR not merely as engaging tools but as scientific instruments that can capture the hidden facets of human cognition and behavior. Prof. Alcañiz and colleagues have demonstrated XRBB in practice by identifying multiple candidate biomarkers for ASD (sensory hyperarousal patterns, gaze anomalies, motor irregularities) using XR setups, thereby validating the concept’s utility. The XRBB framework is now influencing how researchers think about digital biomarkers in neurodevelopment – shifting focus towards immersive, behaviorally grounded metrics rather than only genetic or neural correlates.

Novelty and Impact

Prof. Alcañiz’s XRBB approach is an innovative advancement compared to prior VR-based methods for assessing and training neurodevelopmental disorders. Earlier VR interventions for ASD were largely therapeutic or educational in nature – for example, role-playing social conversations with virtual characters, practising vocational skills in a simulated workplace, or teaching emotion recognition through gamified VR scenarios. While such interventions are valuable, they typically measure progress in coarse terms (completion of tasks, improvements in skill performance) and rely on pre-defined outcomes (e.g. did the child learn the skill?). In contrast, the XRBB approach transforms the VR session into a rich assessment battery, continuously mining the user’s spontaneous reactions for diagnostic clues. This is a novel shift from using VR as a training device to using VR as a precision measurement tool. Few studies before Alcañiz’s work had leveraged VR in this data-driven way: as noted in one review, “most of the work [with VR and ASD] has used virtual reality for the learning objectives of interventions. Very few studies have used biological signals for detailed analysis of behavioural responses that can be used to monitor or produce changes over time.” Prof. Alcañiz filled this gap by marrying VR with biosensors and AI analytics, effectively introducing bio-behavioural monitoring into VR environments for ASD. This integration was not previously standard in the field, marking his approach as a clear break from the more scripted, therapy-focused VR applications of the 2010s.

Another aspect of novelty is the multimodal nature of XRBB. Some earlier works used individual modalities in isolation (for instance, eye-tracking or EEG alone as potential autism biomarkers), whereas Prof. Alcañiz’s projects combine several streams within the same XR platform: physiological arousal, gaze, movement, voice, and performance. The T-Eye project, for example, is described as a “pioneering” system that unites VR with artificial intelligence and biomarkers: a child in a virtual park wears an EDA wristband and eye-tracking glasses while their body movements and even voice patterns are analyzed by AI. This holistic behaviour capture is unique and goes beyond what other VR assessment tools offer. It mirrors a real clinical examination (where a clinician would simultaneously observe eye contact, body language, stress signs, etc.) but accomplishes it with objective sensors and algorithms. By designing VR setups that are biometrically instrumented, Prof. Alcañiz’s approach yields a breadth of data that single-modality approaches lack, thereby increasing the robustness and richness of the biomarkers identified. The novelty also lies in applying machine learning classification to these behavioural datasets: Alcañiz’s 2020 study was among the first to report high-accuracy autism discrimination using VR-induced responses and ML models, showcasing the potential of computational models to interpret complex behavioural data from XR settings.

The impact of these contributions is multifaceted. Scientifically, introducing XRBB has pushed researchers to rethink how neurodevelopmental disorders can be studied – shifting some focus from static tests and expensive neuroimaging towards interactive, ecologically valid evaluations. It complements large-scale efforts to find ASD biomarkers (such as genetic and neuroimaging studies) by providing a behavior-centered approach more directly related to functional abilities in real-world contexts. Clinically, if validated at scale, XRBB-based systems could transform early diagnosis and personalized intervention for autism. For instance, an XRBB assessment might detect subtle signs of ASD in a 3-year-old by analyzing their gaze and stress responses in a virtual playroom, enabling earlier diagnosis than standard methods, which rely on observable behavioural symptoms that might only fully manifest later. The Spanish pilot studies by Alcañiz’s team already indicate that such a system can identify ASD with ~90% accuracy in young children, a level of performance that could significantly aid clinicians if replicated broadly. Moreover, XRBB can guide interventions: the biomarkers obtained can stratify children by specific sensory or social profiles, allowing therapies to be tailored to those profiles (an approach aligned with the precision medicine trend). Prof. Alcañiz’s concept of “biomarker-guided VR stimulation” (as in the ADAPTEA project) underscores the impact on intervention design – VR exercises can be adjusted in difficulty or modality based on a child’s biomarker responses, making therapy more adaptive than the one-size-fits-all programs used previously.

Compared to other VR-based methods, what makes Alcañiz’s approach unique is this closed-loop integration of assessment and training via biomarkers. Other researchers have certainly used VR for ASD, but mainly in an open-loop manner (deliver the intervention, then assess outcomes separately). Alcañiz’s XRBB framework blurs that line: assessment is continuous during the intervention, and the data immediately feed back into optimizing the therapy. This is a novel paradigm with a potentially significant impact – it could shorten the feedback cycle in therapy (the system “knows” if the child is overstimulated or under-engaged and can adjust instantly) and provide objective progress metrics over time (biomarker trends can show improvement in, say, social anxiety or attention). The XRBB approach also has a cross-disciplinary impact: it exemplifies the convergence of extended reality, AI, and cognitive neuroscience. The concept has been extended beyond autism into other domains (for example, Alcañiz has discussed XRBB for characterizing general human cognition and leadership skills), showing that the method is broadly applicable. In the neurodevelopmental context, his pioneering work models how immersive technologies can yield new diagnostic tools. It has inspired follow-up research in “virtual biomarkers” and received international attention (e.g. through keynotes and conference talks on XRBB). The ultimate impact is a move toward assessments that are more objective, earlier, and more engaging for patients – a legacy of innovation that Prof. Alcañiz’s contributions are cementing in autism research.

Conclusion

Prof. Mariano Alcañiz has made significant and forward-looking contributions to using virtual and extended reality in understanding and managing neurodevelopmental disorders, especially autism. He has elevated VR from a promising idea to a sophisticated research and clinical tool by introducing the concept of XR-based Behavioral Biomarkers and demonstrating its feasibility. In summary, his key contributions include developing immersive VR environments (like T-ROOM) tailored for children with ASD; integrating multimodal sensors into these environments to capture hidden indicators of neurodevelopmental function; applying machine learning to these data to uncover quantitative behavioural biomarkers that distinguish or characterize individuals with ASD; and using those biomarkers to drive personalized, adaptive interventions (as seen in projects like ADAPTEA). He essentially built a bridge between experimental psychology, clinical diagnostics, and cutting-edge XR technology, enabling experimentally rigorous and ecologically valid assessments.

The introduction of XRBB stands out as a novel paradigm that frames behaviour in virtual worlds as data for precision medicine. This novelty has been contrasted against earlier VR methods and shown to add unique value in objectivity and adaptability. The impact of Alcañiz’s work is evident in both research and practical contexts: his studies have provided proof of concept that VR-induced responses (such as physiological arousal or gaze patterns) can serve as reliable biomarkers for ASD, and ongoing trials suggest these methods can improve early detection rates and intervention personalization in clinical settings. As the field progresses, Prof. Alcañiz’s contributions lay a foundation for future directions such as validating XRBB tools with larger and more diverse populations, integrating additional modalities (e.g. neuroimaging or speech analysis) into the VR frameworks, and extending these approaches to other neurodevelopmental disorders like ADHD, or to conditions such as social anxiety, where XR environments could probe specific deficits. Moreover, his vision hints at a future where routine developmental assessments might involve playful VR sessions that simultaneously diagnose and train, guided by AI – a future in which his XRBB concept would play a central role. In conclusion, Prof. Alcañiz has not only introduced a new concept in the form of XRBB but has also exemplified how marrying extended reality with biometrics and AI can revolutionize both the assessment and treatment of neurodevelopmental disorders, with autism as a prime example of this paradigm shift.

Some related papers

Alcañiz, M., Chicchi Giglioli, I. A., Carrasco-Ribelles, L. A., Marín-Morales, J., Minissi, M. E., Teruel-García, G., … & Abad, L. (2022). Eye gaze as a biomarker in the recognition of autism spectrum disorder using virtual reality and machine learning: A proof of concept for diagnosis. Autism Research, 15(1), 131–145.

Alcañiz Raya, M., Chicchi Giglioli, I. A., Marín-Morales, J., Higuera-Trujillo, J. L., Olmos, E., Minissi, M. E., … & Abad, L. (2020). Application of supervised machine learning for behavioral biomarkers of autism spectrum disorder based on electrodermal activity and virtual reality. Frontiers in Human Neuroscience, 14, 90.

Minissi, M. E., Altozano, A., Marín-Morales, J., Giglioli, I. A. C., Mantovani, F., & Alcañiz, M. (2024). Biosignal comparison for autism assessment using machine learning models and virtual reality. Computers in Biology and Medicine, 171, 108194.

Minissi, M. E., Chicchi Giglioli, I. A., Mantovani, F., & Alcaniz Raya, M. (2022). Assessment of the autism spectrum disorder based on machine learning and social visual attention: A systematic review. Journal of Autism and Developmental Disorders, 52(5), 2187–2202.

Alcaniz Raya, M., Marín-Morales, J., Minissi, M. E., Teruel Garcia, G., Abad, L., & Chicchi Giglioli, I. A. (2020). Machine learning and virtual reality on body movements’ behaviors to classify children with autism spectrum disorder. Journal of Clinical Medicine, 9(5), 1260.

Maddalon, L., Minissi, M. E., Parsons, T., Hervas, A., & Alcaniz, M. (2024). Exploring adaptive virtual reality systems used in interventions for children with autism spectrum disorder: systematic review. Journal of Medical Internet Research, 26, e57093.

Minissi, M. E., Landini, G. A. R., Maddalon, L., Torres, S. C., Giglioli, I. A. C., Sirera, M., … & Alcañiz, M. (2023, August). Virtual reality-based serious games to improve motor learning in children with autism spectrum disorder: An exploratory study. In 2023 IEEE 11th International Conference on Serious Games and Applications for Health (SeGAH) (pp. 1-6). IEEE.

Minissi, M. E., Gómez-Zaragozá, L., Marín-Morales, J., Mantovani, F., Sirera, M., Abad, L., … & Alcañiz, M. (2023). The whole-body motor skills of children with autism spectrum disorder taking goal-directed actions in virtual reality. Frontiers in Psychology, 14, 1140731.

Altozano, A., Minissi, M. E., Maddalon, L., Alcañiz, M., & Marín-Morales, J. (2024, June). Exploring Autism Assessment Through Parental Open-Ended Questionnaires. In Decision Science Alliance International Summer Conference (pp. 210-217). Cham: Springer Nature Switzerland.
Altozano, A., Minissi, M. E., Alcañiz, M., & Marín-Morales, J. (2025). Introducing 3DCNN ResNets for ASD full-body kinematic assessment: A comparison with hand-crafted features. Expert Systems with Applications, 270, 126295.