Integrating neuroscience, immersive technology, and organisational behaviour has given rise to a new paradigm for evaluating and developing leadership skills. Traditional leadership assessments have relied on explicit measures (surveys, interviews) that often lack ecological validity and are prone to bias. In contrast, Prof. Mariano Alcañiz and his colleagues at LabLENI have pioneered the use of virtual reality (VR) and extended reality (XR) environments as advanced tools for leadership assessment and training, bringing objective neurological and behavioural metrics into organisational research. By recreating realistic workplace scenarios in VR under controlled conditions, their work allows leadership competencies to be observed in action under dynamic yet repeatable circumstances – a combination that standard methods cannot provide [1]. This immersive approach addresses a key challenge in leadership research: how to evoke real-life management situations in a laboratory setting. VR provides a safe yet lifelike “assessment playground” where complex social and decision-making situations can unfold and leaders’ responses can be measured quantitatively with high ecological validity [1]. Importantly, these VR-based assessments occur stealthily: leaders engage in simulated tasks without feeling that they are taking a test, while the system unobtrusively collects data on their behaviour and reactions [1][4]. Such stealth assessment design, grounded in evidence-centred game design, captures authentic leadership behaviours that might be missed or misrepresented in interviews or questionnaires [4]. Overall, the convergence of VR with organisational science in Prof. Alcañiz’s work has opened new avenues for creating engaging, interactive simulations that assess and enhance leadership skills in ways not achievable with conventional approaches.
Immersive VR for Leadership Assessment and Training
A cornerstone of this organisational neuroscience research is the development of immersive serious games and virtual scenarios for leadership evaluation. Prof. Alcañiz’s team introduced a three-dimensional serious game framework to embed participants (such as managers or job candidates) in realistic team-based situations requiring leadership competencies [4]. In one such VR simulation, virtual team members – interactive avatars with distinct personalities and roles – present the participant with workplace challenges (e.g. resolving a team conflict or making a strategic decision under pressure). The participant must navigate these scenarios by making decisions and communicating with the virtual team, effectively “learning by doing” in VR. During these sessions, the system records a wealth of behavioural data: decisions taken, response times, and even subtle actions like where the participant looks or how long they hesitate. This innovative evidence-centred design ensures that each virtual scenario is carefully crafted to elicit behaviours tied to specific leadership constructs (for example, a scenario may be designed to reveal a person’s tendency toward task-oriented vs people-oriented leadership) [4]. By seamlessly integrating assessment into gameplay, this approach yields objective performance indicators of leadership skills under lifelike conditions. Studies have demonstrated that such VR simulations can discern leadership styles: for instance, distinct decision-making patterns observed across a series of VR team problem-solving scenarios were found to correlate with classical leadership orientations (task-focused vs. relationship-focused) [4][5]. The VR assessments thus simultaneously capture multiple facets of leadership – decision-making, communication, and stress responses – overcoming the one-dimensionality of single questionnaires.
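The kind of in-scenario evidence stream described above can be pictured with a minimal logging sketch. All class, field, and scenario names here are hypothetical illustrations, not the team's published implementation:

```python
from dataclasses import dataclass

@dataclass
class ScenarioEvent:
    """One logged observation from a VR leadership scenario."""
    scenario: str          # e.g. "team_conflict" (invented label)
    choice: str            # decision the participant made
    response_time_s: float # hesitation before acting
    gaze_target: str       # "face" or "task_object"

def summarize(events):
    """Aggregate raw events into two simple leadership indicators:
    mean response time and the share of gaze samples on faces."""
    n = len(events)
    mean_rt = sum(e.response_time_s for e in events) / n
    face_share = sum(e.gaze_target == "face" for e in events) / n
    return {"mean_rt": mean_rt, "face_gaze_share": face_share}

log = [
    ScenarioEvent("team_conflict", "mediate", 4.2, "face"),
    ScenarioEvent("team_conflict", "delegate", 3.0, "task_object"),
    ScenarioEvent("deadline", "reassign", 5.4, "face"),
    ScenarioEvent("deadline", "push_team", 2.8, "face"),
]
print(summarize(log))  # face gaze share here is 3/4 = 0.75
```

In an evidence-centred design, each such aggregate would be mapped back to the leadership construct a scenario was built to elicit (e.g. gaze on faces as one cue toward people-oriented leadership).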
Beyond assessment, VR offers rich possibilities for leadership training. Because the virtual environments are software-controlled, they can adapt to a leader’s actions in real time. Prof. Alcañiz’s group has highlighted how an intelligent tutoring system in VR could monitor a leader’s performance and dynamically adjust the scenario – for example, introducing an unexpected challenge if the participant is excelling, or providing supportive feedback if the participant is struggling [1]. This closed-loop training capability (essentially a form of neurofeedback-driven coaching) directly extends their assessment research. By coupling real-time behavioural measures with adaptive scenario content, the VR system can personalise the training experience to target an individual’s weaknesses or to cultivate specific skills. Such an approach represents a shift from static training exercises to interactive, adaptive learning experiences for leadership development. Prof. Alcañiz and colleagues envision that, in the near future, organizations will use these VR platforms as part of routine leadership development programs: leaders will enter immersive simulations that not only evaluate their competencies with quantitative metrics but also help improve them through guided practice and instant feedback [1]. Early implementations of this vision in LabLENI’s research show that participants generally find the simulations engaging and realistic, suggesting high acceptance of VR as a leadership training medium. The novelty of this work lies in merging training with assessment: working through a challenging virtual leadership exercise yields data that objectively profiles the leader’s skills while concurrently providing a developmental experience – effectively blurring the line between evaluation and learning.
Integrating Neuroscience and Implicit Behavioral Measures
A hallmark of Prof. Alcañiz’s organisational neuroscience contributions is the incorporation of biometric and neural measurements into VR-based leadership tasks. Just as his team did in clinical contexts, they brought tools such as eye-tracking, physiological sensors, and machine learning analysis into the organisational domain to capture the implicit reactions of leaders during complex social interactions. In one series of studies, participants wore EEG caps and galvanic skin response (GSR) sensors while engaging in leadership-related activities, revealing the neurophysiological correlates of leadership styles [2]. This work was among the first to directly link objective neurophysiological signals with established leadership dimensions. For example, the research found that individuals who scored high on relationship-oriented leadership (ROL) showed distinct EEG patterns over frontal brain regions when exposed to emotional stimuli, whereas those differences were not present for task-oriented leaders [2]. These subtle brainwave variations – detectable only with sensitive equipment – suggest that leaders who excel in the people-focused aspects of leadership may process social-emotional information differently at the neural level. Notably, the study’s machine learning models predicted high versus low relationship-oriented leadership with about 81% accuracy using EEG features alone, highlighting the potential of brain data as biomarkers of leadership aptitude [2]. Such findings are groundbreaking in bringing physiological evidence into leadership research, which has traditionally been the realm of psychology and the management sciences. By demonstrating measurable brain and autonomic patterns associated with leadership behaviour, Prof. Alcañiz’s team has advanced the idea that leadership is not just a behavioural or personality construct but one with identifiable signatures in the nervous system.
This neuroscience integration adds a new layer of rigour to leadership assessment – one that can validate (or sometimes challenge) what leaders report about themselves with what their bodies and brains reveal in simulated leadership situations.
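The shape of such an EEG-to-prediction pipeline can be sketched roughly as follows. The feature, the toy data, and the stand-in "model" below are all invented for clarity; the published work [2] used far richer EEG feature sets and proper machine learning models:

```python
import math

def frontal_alpha_asymmetry(left_alpha, right_alpha):
    """A classic frontal EEG feature: ln(right) - ln(left) alpha power.
    Used here purely as an illustrative scalar input."""
    return math.log(right_alpha) - math.log(left_alpha)

def fit_threshold(features, labels):
    """Stand-in for a trained classifier: the midpoint between the
    mean feature value of each class (1 = high ROL, 0 = low ROL)."""
    hi = [f for f, y in zip(features, labels) if y == 1]
    lo = [f for f, y in zip(features, labels) if y == 0]
    return (sum(hi) / len(hi) + sum(lo) / len(lo)) / 2

# Toy frontal alpha powers (left, right) for six participants
# with known relationship-oriented leadership labels.
powers = [(1.0, 2.2), (1.0, 1.8), (1.0, 2.0),
          (1.2, 1.0), (1.5, 1.0), (1.0, 1.5)]
labels = [1, 1, 1, 0, 0, 0]

features = [frontal_alpha_asymmetry(l, r) for l, r in powers]
cut = fit_threshold(features, labels)
preds = [1 if f > cut else 0 for f in features]
accuracy = sum(p == y for p, y in zip(preds, labels)) / len(labels)
```

On this invented sample the threshold rule misclassifies one participant (accuracy 5/6); the point is only the workflow – extract a physiological feature, fit a decision rule, score held-out behaviour – not the numbers.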
In parallel, the LabLENI team has leveraged implicit behavioural cues – especially eye gaze – as indices of the cognitive and emotional processes relevant to leadership and empathy. In their VR simulations, participants’ eye movements are tracked unobtrusively as they interact with virtual colleagues. Where, and for how long, a leader looks at certain stimuli during a scenario can be highly informative. For instance, in a VR-based hiring or feedback session, a leader who consistently gazes at team members’ faces and social cues might be exhibiting a more relationship-centred approach, whereas one who focuses on task-relevant objects or documents might lean task-oriented. Alcañiz’s research confirmed that eye-tracking metrics, combined with decision logs from the simulation, enable accurate classification of leadership style without any direct questions to the leader [5]. In a 2022 study, participants navigated workplace scenarios in VR while their gaze patterns and choices were recorded; a machine learning model successfully identified individuals with high versus low leadership scores based solely on these implicit measures [5]. Strikingly, eye gaze features contributed even more to the model’s accuracy than overt behavioural choices, underlining that subtle attentional differences are robust cues to a person’s leadership approach [5]. Similarly, the team applied this methodology to assess empathy in organisational contexts: they created immersive VR situations in which participants had to interpret and respond to colleagues’ emotional needs, and found that people’s eye movements and actions could distinguish those with high empathy from those with lower empathy on standardised tests [3]. Four facets of empathy (perspective-taking, emotional understanding, empathetic stress, and empathetic joy) were each predicted by visual attention patterns and in-game decisions during these social scenarios [3].
These results are significant because they demonstrate a viable new assessment procedure for soft skills like empathy – one that uses behavioural biomarkers gathered in a controlled yet realistic virtual workplace, analysed by machine learning, to infer psychological traits [3]. The work on empathy and leadership style shows the broad applicability of immersive tools: whether the target trait is a leadership competency or a socio-emotional capacity, VR scenarios combined with AI-driven analytics can yield quantitative assessments that complement or surpass traditional pen-and-paper measures. Notably, another study in the lab extended this approach to fundamental motivational traits, using a serious VR game to assess participants’ core psychological needs (such as the need for attachment or self-esteem) based on their in-game behaviours. The virtual game identified individuals’ dominant psychological needs with high accuracy, outperforming questionnaires and avoiding biases like social desirability [6]. By capturing how people behave in challenging or emotionally salient situations, the researchers tapped into the implicit drivers of behaviour that conventional tests often fail to reveal [6]. Across these investigations, Prof. Alcañiz’s team has consistently pushed the boundary of assessment science – demonstrating that immersive simulations instrumented with biosensors can uncover latent attributes of individuals in organisational settings, from leadership style to empathy to motivation. This fusion of neuroscience and behavioural data with VR not only provides more objective and multifaceted profiles of individuals but also represents a new scientific approach to studying leadership and organizational behaviour with the experimental precision of the lab and the realism of the field.
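The gaze-plus-decisions idea can be caricatured as a weighted score. The weights, cutoff, and feature definitions below are invented; the actual studies [5][3] trained machine learning models on many such features:

```python
def style_score(face_dwell_ratio, people_choice_ratio,
                w_gaze=0.7, w_choice=0.3):
    """Toy linear combination of an implicit gaze feature (share of
    dwell time on faces) and an explicit one (share of people-centred
    decisions). Gaze is weighted more heavily here, echoing the
    finding that gaze features carried more predictive weight than
    overt choices; the actual weights are invented."""
    return w_gaze * face_dwell_ratio + w_choice * people_choice_ratio

def classify_style(score, cutoff=0.5):
    """Map the score to the two classical orientations."""
    return "relationship-oriented" if score >= cutoff else "task-oriented"

print(classify_style(style_score(0.8, 0.4)))  # score ~0.68
print(classify_style(style_score(0.2, 0.5)))  # score ~0.29
```

A real model would learn both the weights and the cutoff from labelled data rather than fixing them by hand, but the structure – implicit cues dominating explicit ones – is the same.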
XR-Based Behavioral Biomarkers in Leadership
Building on these empirical advances, Prof. Alcañiz has helped articulate a unifying concept for the field: XR-Based Behavioral Biomarkers (XRBB) for leadership. XRBB refers to objectively measurable patterns of behavior or physiology, obtained within an extended reality (XR) environment, that serve as indicators of an individual’s leadership-related capacities or state. In simpler terms, an XR-based behavioral biomarker is a digital fingerprint of leadership performance elicited in an immersive simulation. This concept directly extends the team’s prior introduction of VR-based biomarkers in clinical neuroscience to the organizational domain. In the context of leadership, XRBBs might include a characteristic eye-gaze pattern in a team interaction scenario, a spike in heart rate variability when handling an unexpected crisis in VR, or a consistent strategy in decision-making games that reflects a certain leadership style. The key idea is that by placing a person in a lifelike yet controlled XR situation that replicates real-world organizational challenges (for example, negotiating with a virtual client, or managing a conflict between virtual employees), one can capture naturalistic reactions that are quantifiable. These reactions – spanning where the person looks, how they communicate, physiological stress responses, timing of actions, etc. – form a rich data stream. Through advanced computational analysis, including machine learning, specific features in these data can be identified that correlate strongly with leadership efficacy, style, or potential. Organizational Neuroscience (ON) frameworks embrace such implicit measures to better understand workplace behavior, and XR technologies provide the ideal vehicle to gather them [8]. Alcañiz et al. (2024) emphasize that XR environments offer interactive, multi-sensorial stimuli that engage leaders in complex tasks, thereby eliciting the very behaviors and brain responses that traditional assessments struggle to capture [8]. By measuring these responses, one can derive leadership biomarkers that are far more objective and context-sensitive than legacy assessments. For example, an XRBB for adaptive leadership might be defined as the variability in a leader’s stress (electrodermal activity) signal when confronted with rapidly changing scenarios – a stable low variability might indicate calm adaptability, whereas erratic spikes could signal difficulty in coping with change. Similarly, an XRBB for empathic leadership might be the proportion of time a leader spends making eye contact with distressed team members in a simulation, relative to focusing on tasks. These are not mere conjectures; they are the kinds of measures that have begun to emerge from Prof. Alcañiz’s studies. By formalizing the XRBB concept, the research provides a conceptual roadmap for future leadership assessment tools that leverage immersive technology. In practical terms, the XRBB approach means that leadership assessments in the 21st century can shift toward neuroscientific, data-driven methods: instead of filling out questionnaires about one’s leadership style, a candidate might engage in a 30-minute VR assessment module, after which a profile of their leadership biomarkers is generated. Those biomarkers could then predict real-world outcomes like team performance or employee engagement, making them valuable for recruitment, training personalization, and talent development decisions [8]. This vision aligns with a broader trend identified in the field – the mismatch between modern training needs and outdated evaluation methods can be resolved by importing neuroscientific techniques into management, using XR interfaces as the medium [8].
Prof. Alcañiz’s contributions thus lie not only in individual experiments but in pioneering a holistic framework (XRBB) that marries immersive technology with organizational assessment. This framework positions XR as more than a gimmick – it becomes a serious research and practical tool for deriving quantitative biomarkers of complex leadership constructs. As a result, his work is helping to transform how organizations might identify and nurture leadership talent, moving towards assessments that are engaging, evidence-based, and rooted in how people actually behave under realistic conditions.
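Two of the candidate biomarkers mentioned above – stress variability under changing scenarios and eye contact with distressed colleagues – reduce to very simple computations once the XR system has logged the underlying signals. This is a sketch with invented data; real XRBBs would be validated against organizational outcomes:

```python
from statistics import pstdev

def eda_variability(eda_trace):
    """Candidate XRBB for adaptive leadership: dispersion of the
    electrodermal activity trace recorded across scenario changes.
    Stable low variability suggests calm adaptability; erratic
    spikes raise the value."""
    return pstdev(eda_trace)

def eye_contact_share(gaze_samples):
    """Candidate XRBB for empathic leadership: fraction of gaze
    samples landing on distressed team members' faces rather than
    on task objects."""
    hits = sum(s == "distressed_face" for s in gaze_samples)
    return hits / len(gaze_samples)

calm    = [0.31, 0.30, 0.32, 0.31, 0.30]   # toy EDA values, microsiemens
erratic = [0.30, 0.75, 0.28, 0.90, 0.33]
assert eda_variability(calm) < eda_variability(erratic)

gaze = ["distressed_face", "task_board", "distressed_face", "document"]
print(eye_contact_share(gaze))  # 2 of 4 samples -> 0.5
```

The hard scientific work, of course, lies not in these computations but in showing that such numbers correlate reliably with leadership efficacy – which is precisely what the XRBB research program sets out to establish.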
Multimodal Extensions and Future Directions
While VR and XR simulations are central to this organizational neuroscience agenda, Prof. Alcañiz’s team has also explored other biometrics and modalities to broaden the scope of leadership and HR assessment. One notable extension is in the domain of vocal communication, an area highly relevant to leadership effectiveness. The team has investigated how emotional and linguistic patterns in speech can serve as indicators of psychological states or competencies in professional settings. For example, they developed the EMOVOME (Emotional Voice Messages) database, a large collection of spontaneous voice messages recorded from real conversations, to facilitate machine learning research on emotion recognition in speech [9]. By analyzing this dataset, the researchers aim to identify vocal features that betray authentic emotions – excitement, frustration, empathy – as they occur in natural communication. Such vocal biomarkers could be invaluable in organizational contexts: a leader’s tone of voice in a meeting or a salesperson’s vocal emotion when pitching to a client might predict outcomes like team morale or sales success. The creation of EMOVOME demonstrates the team’s commitment to capturing human behavior in the wild (outside the lab) and leveraging it for organizational insights [9]. In parallel, they have studied the semantics of speech in simulated versus real interactions. In a novel study bridging neuroscience and human resource management, Prof. Alcañiz and colleagues compared how job applicants respond in a VR job interview versus a traditional face-to-face interview, focusing on the content and sentiment of their answers [10]. By examining the language used in each setting, they sought to understand whether VR elicits different communication patterns or stress levels compared to in-person meetings. 
The findings suggested that well-designed VR interviews can produce conversational dynamics and linguistic markers comparable to those in physical interviews, supporting the viability of VR for remote recruitment and assessment [10]. Moreover, subtle differences in word choice and emotional tone between the two settings offered clues about how the medium (virtual vs. physical) might influence a candidate’s cognitive state and self-presentation. This line of inquiry – using natural language processing and voice analysis alongside VR – enriches the overall organizational neuroscience approach. It indicates that a comprehensive assessment of a leader or employee could involve not only data from immersive simulations (behavioral and physiological XRBBs) but also analysis of how they speak and express themselves in various contexts. The multimodal strategy championed by Prof. Alcañiz’s group is thus painting a more complete picture of human behavior at work: brain waves, eye movements, heart signals, decisions, and voice are all pieces of the puzzle. By harnessing these diverse data streams through interdisciplinary methods, the research is pushing the boundaries of how we define and measure constructs like leadership, emotional intelligence, and job performance.
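A toy version of the cross-setting language comparison might look like this. The word lists, transcripts, and function names are invented for illustration; the actual study used proper NLP and sentiment pipelines [10]:

```python
# Tiny invented sentiment lexicons (stand-ins for real resources).
POSITIVE = {"confident", "glad", "excited", "opportunity"}
NEGATIVE = {"nervous", "worried", "stressful", "difficult"}

def sentiment_ratio(transcript):
    """Crude lexicon-based sentiment: (positive - negative) word
    hits divided by total tokens."""
    tokens = transcript.lower().split()
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)

def setting_gap(vr_answer, f2f_answer):
    """Sentiment gap between a VR and a face-to-face answer to the
    same question; a small absolute gap is consistent with comparable
    conversational dynamics across the two media."""
    return sentiment_ratio(vr_answer) - sentiment_ratio(f2f_answer)

vr  = "i am confident this role is a real opportunity"
f2f = "i am glad and confident about this opportunity"
print(setting_gap(vr, f2f))
```

Real pipelines would use tokenization, embeddings, and validated sentiment models rather than word counting, but the analytical question is the same: does the medium shift the emotional and semantic profile of what candidates say?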
In summary, Prof. Alcañiz’s work in organizational neuroscience represents a significant conceptual and technological leap for leadership research and practice. It introduces immersive XR technology as a powerful methodology to observe leadership and social behavior with unprecedented realism and objectivity [1][7]. It integrates neuroscientific measures – from EEG to gaze tracking – to uncover the implicit processes underlying effective leadership and teamwork [2][5][3]. It applies machine learning to translate complex data into meaningful profiles, enabling the identification of behavioral biomarkers that can predict leadership styles or empathy levels [5][3][2]. It also expands into new modalities like voice, recognizing that communication patterns are an integral part of the leadership spectrum [9][10]. Across these contributions, a unifying theme is the enhancement of ecological validity and rigor in assessing human factors in organizations: by measuring people in action, in context, and through multiple channels of data, Prof. Alcañiz’s approach mitigates the limitations of self-report and static tests. The novelty of this body of work is evident in the development of the XRBB concept – a forward-looking idea that encapsulates how future assessments can derive objective leadership indicators from immersive experiences [8].
Ultimately, these innovations pave the way for a new generation of leadership development tools that are data-driven and personalized. Leaders can be evaluated not just by what they say about themselves or by others’ subjective opinions, but by how they actually behave in realistic simulations, as captured by neural and behavioral sensors. Such a shift has profound implications: it brings scientific precision to fields like management training and talent assessment, and it enables ongoing feedback and improvement. Prof. Alcañiz and his team stand at the forefront of this emerging field, and their organizational neuroscience research is transforming theoretical ideas into practical, impactful solutions. By bridging VR, XR, AI, and neuroscience within organizational settings, they have demonstrated a model for interdisciplinary innovation that is advancing both our understanding of leadership and the tools available to cultivate it.
References:
[1] Alcañiz, M., Parra, E., & Chicchi Giglioli, I. A. (2018). Virtual reality as an emerging methodology for leadership assessment and training. Frontiers in Psychology, 9, 1658.
[2] Parra Vargas, E., Philip, J., Carrasco-Ribelles, L. A., Chicchi Giglioli, I. A., Valenza, G., Marín-Morales, J., & Alcañiz Raya, M. (2023). The neurophysiological basis of leadership: a machine learning approach. Management Decision, 61(6), 1465–1484.
[3] Parra Vargas, E., García Delgado, A., Torres, S. C., Carrasco-Ribelles, L. A., Marín-Morales, J., & Alcañiz Raya, M. (2022). Virtual reality stimulation and organizational neuroscience for the assessment of empathy. Frontiers in Psychology, 13, 993162.
[4] Parra, E., Chicchi Giglioli, I. A., Philip, J., Carrasco-Ribelles, L. A., Marín-Morales, J., & Alcañiz Raya, M. (2021). Combining virtual reality and organizational neuroscience for leadership assessment. Applied Sciences, 11(13), 5956.
[5] Parra, E., García Delgado, A., Carrasco-Ribelles, L. A., Chicchi Giglioli, I. A., Marín-Morales, J., Giglio, C., & Alcañiz Raya, M. (2022). Combining virtual reality and machine learning for leadership styles recognition. Frontiers in Psychology, 13, 864266.
[6] Giglioli, I. A. C., Carrasco-Ribelles, L. A., Parra, E., Marín-Morales, J., & Alcañiz Raya, M. (2021). An immersive serious game for the behavioral assessment of psychological needs. Applied Sciences, 11(4), 1971.
[7] Parra, E., Alcañiz, M., Giglio, C., & Chicchi Giglioli, I. A. C. (2022). Use of XR technologies for the assessment and training of leadership skills. In Roadmapping Extended Reality: Fundamentals and Applications (pp. 321–335). Springer.
[8] Alcañiz, M., Parra, E., Chicchi Giglioli, I. A. C., & García, A. (2024). Using virtual reality for leadership assessment and training through behavioral biomarkers. In Biometrics and Neuroscience Research in Business and Management: Advances and Applications (pp. 141–170). De Gruyter.
[9] Gómez-Zaragozá, L., del Amor, R., Parra Vargas, E., Naranjo, V., Alcañiz Raya, M., & Marín-Morales, J. (2024). Emotional Voice Messages (EMOVOME) database: emotion recognition in spontaneous voice messages. arXiv preprint arXiv:2402.XXXX.
[10] Alcañiz Raya, M., Parra Vargas, E., Philip, J., Gómez-Zaragozá, L., Marín-Morales, J., & García Delgado, A. (2022). Neuroscience & HRM: Comparing the semantics of vocal responses in face-to-face and VR job interviews. Academy of Management Proceedings, 2022(1), 10757.