CONSUMER BEHAVIOUR ACHIEVEMENTS

Virtual Reality (VR) and related immersive media have emerged as transformative tools in marketing and consumer research, offering unprecedented ways to simulate shopping experiences and capture consumer responses. Prof. Mariano Alcañiz and colleagues at LabLENI have been at the forefront of this integration, providing a rigorous framework for applying Extended Reality (XR) in marketing science. This framework formalized how VR, augmented, and mixed reality can serve as methodological platforms – not just gimmicks – in consumer research, highlighting VR’s unique ability to combine high ecological realism with precise control and measurement. Using VR, researchers can place consumers in lifelike virtual stores or service environments and simultaneously record their gaze, motions, and physiological signals, something traditional methods could barely achieve. For example, one early contribution outlined how VR marketing studies can track non-verbal behaviors through body-motion capture and eye-tracking in real time, while also gathering psychophysiological metrics (heart rate, skin conductance) synchronized with the virtual stimuli. Such multisensor immersion allows consumer reactions to be analyzed with a fidelity and depth unattainable in standard surveys or lab tasks – addressing a longstanding trade-off between experimental control and realism in consumer research.
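
As a concrete illustration of the stimulus–response synchronization described above, the sketch below aligns each virtual-world event with the physiological sample nearest in time. This is a minimal, hypothetical example of how such streams might be merged; the function names and the nearest-timestamp strategy are assumptions for illustration, not the lab's actual pipeline.

```python
import bisect

def align_events(sample_times, sample_values, event_times):
    """For each virtual-world event, return the physiological sample
    nearest in time. Assumes sample_times is sorted ascending."""
    aligned = []
    for t in event_times:
        i = bisect.bisect_left(sample_times, t)
        # The nearest sample is either just before or just after the event
        candidates = [j for j in (i - 1, i) if 0 <= j < len(sample_times)]
        j = min(candidates, key=lambda k: abs(sample_times[k] - t))
        aligned.append(sample_values[j])
    return aligned
```

In practice, each sensor stream (gaze, heart rate, skin conductance) would be aligned to the same event clock this way before any joint analysis.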

LabLENI’s empirical studies have leveraged this capability to yield practical insights. In one study on virtual commerce, participants navigated a VR shopping environment (a “virtual store”) to test how immersive online retail compares to conventional web interfaces. The research demonstrated that VR can significantly affect consumer behavior: for instance, when consumers used VR headsets to shop, they experienced higher engagement and made more confident purchase decisions than those using a standard 2D website. These findings, published as the first analyses of “V-commerce,” underscored that a more immersive presentation (360° views of products, the ability to virtually pick up items) translated into measurably enhanced marketing outcomes such as increased purchase intent. Similarly, in the context of destination marketing, the team showed that VR panoramas can boost persuasive impact. Tourists who took an immersive 360° virtual tour of a city reported a stronger sense of presence and significantly higher intentions to visit the real location than those who watched a conventional video, an effect mediated by the heightened realism and interactivity of VR. Notably, the VR condition also evoked higher physiological arousal, indicating deeper emotional engagement that, in that study, positively influenced participants’ likelihood to recommend the destination. These results validate that immersive media are not only novel but more effective at eliciting consumer interest and desire, lending support to the idea that XR technologies can serve as superior marketing channels.


Implementing such studies required methodological innovation as well. For instance, Alcañiz’s team developed custom solutions to ensure data quality in VR settings – one being a tailored eye-tracking fixation identification algorithm for head-mounted displays. In immersive environments, a user’s head and eye movements are more dynamic than in desktop setups, complicating the detection of what the user is truly looking at. The LabLENI researchers addressed this by calibrating new parameters for fixation detection specifically for VR, ultimately enabling reliable analysis of visual attention in 360° scenes. This technical contribution has been vital for consumer neuroscience studies, as it allows precise tracking of which products or ad elements draw a shopper’s gaze in a virtual store [10]. The group’s commitment to integrating engineering advances with behavioral science is further evident in their early adoption of neuroimaging for marketing: even before VR became widespread, they quantified brainwave responses (EEG) to multimedia content like sports broadcasts to gauge audience engagement [20]. That early work laid the groundwork for today’s VR experiments by illustrating that physiological data could enrich our understanding of media consumption. Taken together, these efforts firmly established immersive technology as a serious, scientifically grounded approach in consumer research – effectively creating highly instrumented “virtual laboratories” where theories of consumer behavior can be tested under realistic conditions.
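
To make the fixation-detection problem concrete, here is a generic dispersion-threshold (I-DT-style) sketch in which gaze is expressed as world-space direction vectors, so that compensated head rotation alone does not break up a fixation. The thresholds, names, and simplified dispersion check are illustrative assumptions, not the published LabLENI algorithm or its calibrated parameters.

```python
import numpy as np

def angular_distance(v1, v2):
    """Angle in degrees between two unit gaze-direction vectors."""
    cos = np.clip(np.dot(v1, v2), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def detect_fixations(gaze, timestamps, max_dispersion_deg=1.5, min_duration_s=0.25):
    """Dispersion-threshold fixation detection for head-mounted displays.

    gaze: (N, 3) array of unit gaze directions in world coordinates,
          i.e. head pose already compounded with eye-in-head gaze.
    Returns a list of (start_index, end_index) fixation spans.
    Simplification: dispersion is measured against the window's first
    sample rather than the full pairwise spread.
    """
    fixations = []
    start, n = 0, len(gaze)
    while start < n:
        end = start
        # Grow the window while samples stay inside the dispersion cone
        while end + 1 < n and angular_distance(gaze[start], gaze[end + 1]) <= max_dispersion_deg:
            end += 1
        if timestamps[end] - timestamps[start] >= min_duration_s:
            fixations.append((start, end))
            start = end + 1
        else:
            start += 1
    return fixations
```

Mapping each detected fixation span back to the virtual object it landed on (via ray casting into the scene) is what ultimately yields the product-level attention measures discussed above.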

Emotion Recognition and Immersive Affective Computing

A core innovation in this body of work is the use of immersive environments to study and quantify consumer emotions. Emotions play a crucial role in decision-making and brand experience, yet traditional consumer research has struggled to measure them objectively. Alcañiz’s team tackled this challenge by combining VR with affective computing techniques, allowing them to elicit genuine emotional reactions and capture them via biometric sensors. In 2020, they published a comprehensive review of emotion recognition methods in immersive settings, highlighting that VR’s high presence and interactivity offer clear advantages over 2D stimuli for evoking authentic emotional responses. The review noted that VR, especially when paired with implicit measures and machine learning, “has the potential to impact transversely in many research areas, opening new opportunities for the scientific community”. In other words, by presenting users with lifelike scenarios and measuring their subconscious reactions, one can gain insights into emotional processes that were previously inaccessible. This perspective set the stage for a new research paradigm where emotions are studied in context-rich simulations (like virtual supermarkets or virtual cinemas) rather than static images or surveys.

LabLENI’s experimental studies put these ideas into practice. In a pioneering Scientific Reports article, the team developed an emotion-recognition system entirely within a VR environment. Participants were immersed in specially designed virtual scenes intended to induce specific emotional states corresponding to the quadrants of the circumplex model (combinations of high/low arousal and positive/negative valence). While in VR, their neural and cardiac signals were recorded using EEG and ECG sensors. Remarkably, by extracting features from brain waves and heart rhythms and feeding them into machine-learning classifiers, the system could automatically distinguish a user’s emotional state with around 75% accuracy for arousal and 71% for valence. These results were among the first to validate VR as not only an emotion elicitor but also a quantitative emotion assessment tool [4]. They showed that immersive simulations can reliably provoke distinct emotional responses (akin to real-life feelings of calm, excitement, etc.) and that those responses register in measurable physiological patterns. As the authors noted, this capability opens doors for applications in fields as diverse as experience design, health, and marketing, where understanding consumers’ emotional reactions to products or environments is crucial.
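
The general pipeline, physiological feature extraction followed by classification, can be sketched as follows. The specific features, the assumed sampling rate, and the toy nearest-centroid classifier are illustrative placeholders; the published study used richer feature sets and standard machine-learning classifiers.

```python
import numpy as np

FS = 128  # assumed EEG sampling rate in Hz (illustrative)

def band_power(signal, fs, lo, hi):
    """Mean FFT power of `signal` in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (a vagal HRV index)."""
    diffs = np.diff(rr_intervals_ms)
    return np.sqrt(np.mean(diffs ** 2))

def features(eeg, rr_ms):
    """Feature vector: EEG theta/alpha/beta band power plus cardiac RMSSD."""
    return np.array([
        band_power(eeg, FS, 4, 8),    # theta
        band_power(eeg, FS, 8, 13),   # alpha
        band_power(eeg, FS, 13, 30),  # beta
        rmssd(rr_ms),
    ])

class NearestCentroid:
    """Toy stand-in for a trained classifier: the predicted label is the
    closest class centroid in z-scored feature space."""
    def fit(self, X, y):
        X = np.asarray(X, float)
        self.mu, self.sd = X.mean(0), X.std(0) + 1e-9
        Z = (X - self.mu) / self.sd
        self.classes = sorted(set(y))
        self.centroids = {c: Z[np.array(y) == c].mean(0) for c in self.classes}
        return self
    def predict(self, X):
        Z = (np.asarray(X, float) - self.mu) / self.sd
        return [min(self.classes, key=lambda c: np.linalg.norm(z - self.centroids[c]))
                for z in Z]
```

A real system would train separate arousal and valence classifiers on many such feature vectors, one per labeled VR scene exposure.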

Crucially, the team also asked: Are emotions experienced in VR truly comparable to those in the real world? They directly addressed potential skepticism about VR’s ecological validity through a set of studies comparing responses in virtual vs. real environments. In one such study, participants freely explored an art gallery in both physical reality and a high-fidelity VR reproduction of the same gallery. Throughout both experiences, the researchers collected self-reported emotions and physiological signals (again via wearable EEG and ECG). The findings were encouraging for immersive research – self-reports indicated that the virtual museum elicited emotional arousal and valence levels similar to the physical museum, and classifiers trained on physiological metrics could predict those self-reported emotional states with comparable accuracy in both settings. In fact, the classification accuracy for high vs. low emotional arousal in VR was about 75.0%, nearly matching the 71.5% in the real museum. This parity suggests that well-designed VR scenarios can trigger and capture genuine emotional responses nearly as well as real-world scenarios, a critical validation for the use of VR in consumer emotion research [7]. At the same time, the team’s nuanced analyses uncovered subtle differences: for example, heart rate variability (HRV) measures revealed that while the real museum visit produced certain autonomic nervous system changes between high- and low-arousal states, the VR visit did not show the same pattern. In the real environment, high-arousal moments (viewing exciting or moving artworks) were associated with distinct HRV shifts – specifically, reductions in vagal activity – that signal a fight-or-flight response. The VR environment elicited strong subjective arousal, but those particular HRV changes were blunted or absent in VR. 
The authors interpreted this to mean that while VR can psychologically convince us and induce emotion, certain deep physiological reflexes might remain tied to real-world cues (perhaps because the body “knows” it’s not truly in a risky or novel environment) [9]. This insight is important: it tempers enthusiasm with the recognition that immersive media must be carefully evaluated for each response modality. It suggests that future VR-based consumer studies should consider which physiological or behavioral signals translate well from reality and which may require enhanced immersion (e.g. adding tactile or more intense stimuli) to fully engage.

Beyond visual immersion, LabLENI’s affective computing research has extended into analyzing emotions in naturalistic, multimodal contexts. In one project, they ventured outside of VR to decode emotion from voice data – specifically, from social media voice messages recorded “in the wild” [19]. These short audio messages (the kind exchanged on WhatsApp or similar platforms) carry rich emotional undertones in tone and speech patterns. The team developed algorithms to perform speech emotion recognition on this noisy, spontaneous audio, demonstrating their interdisciplinary expertise from signal processing to psychology. The inclusion of this work in an extended CV of consumer neuroscience highlights a broader theme: whether through VR simulations or real-world media like voice, Prof. Alcañiz’s research seeks to capture the implicit, genuine emotional reactions of consumers. By mastering tools to measure emotions – be it via a headset’s biometric sensors or an algorithm parsing a customer’s excited voice note – his body of work enables a more profound understanding of consumer affect than traditional self-report alone. This is foundational for consumer neuroscience because emotions often drive preferences and choices at a subconscious level. Through immersive affective computing, these once-hidden drivers become observable and quantifiable.

Neurophysiological Metrics and Advertising Effectiveness

Another major thrust of this research program is the application of neuroscience-based metrics to evaluate advertising and media effectiveness. Traditional ad research might ask viewers if they liked an ad or remember it; Alcañiz’s approach is to watch the watchers – using tools like EEG, eye-tracking, and skin response to objectively gauge attention, emotion, and memory encoding during advertisements. In doing so, his team has pioneered predictive neuromarketing, showing that physiological data can foreshadow real-world market success. A landmark study in this vein asked whether the effectiveness of new video ads (in this case, YouTube ads) could be predicted from consumers’ neurophysiological responses in the lab. Thirty-five participants watched a set of TV commercials (including Super Bowl ads) while wearing sensors that tracked their brain signals, eye movements, and heart-rate variability. The team then correlated these measurements with each ad’s actual performance “in the wild” – including the number of views and likes the ads garnered online over the next year. The result was striking: neuroscience-based metrics significantly correlated with real-world outcomes, and using a machine learning model (a neural network), the researchers could predict with about 83% accuracy which ads would be most successful, as well as estimate the relative view counts. This was the first study to demonstrate prospectively that neurophysiological responses can predict an advertisement’s future viral success, beating traditional surveys in foresight. It provided hard evidence to marketing practitioners that methods from the lab (like measuring subconscious engagement via brainwaves or galvanic skin response) are not just academic – they translate to business KPIs like views, recall, and brand liking [6]. 
The authors noted this approach could be used at the design stage of advertising: by testing concepts with a small sample and neural measures, advertisers might optimize content before spending millions on airplay. In essence, this work introduced a data-driven paradigm for ad testing, moving toward a scenario where “if the brain likes it, the market will too.”

Even before this predictive analytics study, the team had been exploring how specific neurophysiological signals relate to advertising impact. In a 2016 experiment, they recorded EEG, heart activity, skin conductance (GSR), and respiration from viewers watching a series of TV commercials embedded in other video content. By extracting dozens of features from these signals (spanning brain wave frequencies, heart rate variability, GSR peaks, etc.) and inputting them into various classifiers, they aimed to discriminate whether a given ad was emotionally effective or not. Effectiveness here was defined in terms of an industry metric (the “Ace Score” index, which reflects viewers’ evaluations of the ad). Impressively, the best machine learning model could correctly classify ~89.8% of the ads as high or low effectiveness based purely on the physiological responses they evoked. The most diagnostic signals came from electrodermal activity and heart-rate variability features, which are proxies for emotional arousal and attention. This indicated that moments of heightened skin conductance or certain heart rhythm patterns in viewers tended to correspond to ads that were later rated as more impactful. Such a finding is important for theory: it suggests that the emotional arousal dimension (captured implicitly through biosignals) is a key driver of whether an ad resonates. From a methodological perspective, this study provided a comparative evaluation of analysis techniques, showing which combinations of signal features and classification algorithms yielded the highest accuracy in reading consumers’ subconscious reactions [11]. It thus serves as a foundation for subsequent neuromarketing studies, illustrating a template of how to process complex physiological data for meaningful marketing insights.
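
As an illustration of one of the most diagnostic signal families, the sketch below counts skin-conductance response (SCR) peaks, a common proxy for phasic emotional arousal. The amplitude and spacing thresholds are generic placeholders, not values from the study, and the function names are hypothetical.

```python
import numpy as np

def scr_peaks(gsr, fs, min_amplitude=0.05, min_separation_s=1.0):
    """Count skin-conductance response peaks: local maxima rising at least
    `min_amplitude` microsiemens above the preceding trough, spaced at
    least `min_separation_s` seconds apart.

    gsr: 1-D array of skin conductance samples; fs: sampling rate in Hz.
    Returns the sample indices of accepted peaks.
    """
    peaks = []
    last_peak_t = -np.inf
    trough = gsr[0]
    for i in range(1, len(gsr) - 1):
        trough = min(trough, gsr[i])          # lowest point since last peak
        if gsr[i - 1] < gsr[i] >= gsr[i + 1]:  # local maximum
            t = i / fs
            if gsr[i] - trough >= min_amplitude and t - last_peak_t >= min_separation_s:
                peaks.append(i)
                last_peak_t = t
                trough = gsr[i]
    return peaks
```

A per-ad peak count (or peak rate) is the kind of scalar feature that, together with HRV measures, would feed the classifiers described above.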

Beyond measuring how much an ad engages viewers, Prof. Alcañiz’s work also delves into how different advertisement formats or content choices influence the audience’s cognitive and emotional response. One timely question in recent years has been the efficacy of 360-degree video ads – advertisements shot in an interactive 360° format that viewers can pan around, often experienced in VR or on smartphones. The team carried out one of the first neuroscientific comparisons between 360° ads and traditional flat video ads, using a combination of eye-tracking, EEG, facial emotion coding, and skin sensors to evaluate viewer responses. The findings revealed a nuanced trade-off. On one hand, 360-degree ads produced greater positive emotional engagement for certain products: for example, in ads for exciting, high-involvement products (like travel destinations or electronics), the immersive format elicited more joy and interest than a regular video. This aligns with the idea that interactivity and freedom (being able to look anywhere in the ad) can heighten a sense of novelty and enjoyment. On the other hand, this very freedom reduced visual focus on specific ad elements. The study noted that for fast-moving consumer goods (FMCG) ads, viewers spent less time fixated on the product and brand logos in 360° videos compared to 2D videos. Consistent with this, brand recognition scores were lower after watching 360° ads – presumably because some viewers failed to notice brand cues when they were busy exploring the scene. Thus, while the 360° format boosted emotional appeal (particularly for durable goods ads, where it “carried the risk of non-exposure to some content” but overall increased positive emotions), it also introduced a risk that the marketing message might be diluted or partially missed.
This insight is invaluable for advertisers considering immersive content: it suggests that 360° ads should be used judiciously, perhaps ensuring that critical brand information is placed centrally or repeatedly, knowing that viewers’ attention is more spread out [14]. By quantitatively documenting both the pros and cons of immersive advertising, the research provides a roadmap for balancing engagement vs. attention in next-generation ads.

In another creative study, the team examined the interaction of music and message in advertising – specifically, how the congruence (or mismatch) between background music and the content of a TV ad affects audience response. Conventional wisdom in advertising has long advocated for music that “fits” the ad’s visuals and narrative, on the assumption that congruent music creates a pleasant, unified experience. However, using neurophysiological measures, Alcañiz and colleagues discovered a counterintuitive effect: incongruent music can sometimes make an ad more memorable [17]. In an experiment, participants viewed ads that were carefully edited to have either a matching soundtrack (e.g., upbeat music with upbeat visuals) or a deliberately contrasting one (e.g., calm music with fast-paced visuals), while their brain activity and other responses were recorded. The results showed that ads with mismatched background music triggered higher attentional engagement – viewers’ EEG indicated greater processing effort and alertness, and their subsequent recall of those ads was often better. The likely explanation is that unexpected music violates expectations and thus grabs mental resources: the viewer subconsciously thinks, “hmm, this feels odd,” which leads them to pay closer attention. By contrast, congruent music, while creating a smooth experience, might allow the mind to relax a bit more, potentially reducing the depth of processing. It’s a nuanced finding that challenges a decades-old assumption in advertising. The neurophysiological data (including patterns of frontal alpha asymmetry, which relate to approach/withdrawal motivations) provided evidence that incongruent music stirred more cognitive conflict and interest, whereas congruent music kept the brain in a more routine state. From a practical standpoint, this suggests that advertisers could strategically use a touch of dissonance to make ads stand out – a bold shift from always aiming for seamless congruence. 
Importantly, these insights arose only because the study looked beyond self-report to implicit measures: viewers might not say they liked an ad with odd music better, but their brains and memory performance told a revealing story. This exemplifies how consumer neuroscience can uncover hidden factors that influence ad effectiveness, guiding more innovative creative decisions.

Behavioral Biometrics and Personalization in Virtual Shopping

Perhaps the most forward-looking aspect of Prof. Alcañiz’s consumer neuroscience portfolio is the use of immersive technology to identify and respond to individual differences – essentially turning VR behavior into behavioral biometrics for personalization. The central idea is that how people behave in a virtual environment can serve as a unique signature of their psychology, just as a fingerprint or eye pattern identifies them physically. This represents a conceptual leap: using rich behavioural data from simulations to infer traits about the consumer, enabling experiences to be tailored to their profile in real time. The LabLENI team’s virtual shopping studies illustrate this vision in action.

First, they demonstrated that VR shopping environments yield an unprecedented level of behavioral detail about consumer habits. In a VR supermarket study, participants’ movements and visual attention were tracked as they browsed and selected products. The analysis revealed systematic patterns – for example, the amount of time shoppers spent looking at items, picking them up, or navigating certain aisles differed between product types. The researchers found that these metrics could distinguish between shopping for hedonic products (leisure or luxury items) versus utilitarian products (everyday necessities) [13]. In virtual stores, consumers searching for utilitarian goods tended to follow more efficient paths and fixate quickly on product information, whereas those in hedonic browsing mode exhibited longer exploration and more frequent interactions (like picking up and examining items). Such findings, made possible by comprehensive recording of behavior in VR, provide granular insight to retailers – for instance, indicating that an immersive store could dynamically adapt its layout or prompts depending on whether a customer’s actions suggest a goal-oriented mission or an exploratory mood. This level of adaptive strategy simply wasn’t conceivable with coarse real-world observations alone. By showing that behavior varies with context in measurable ways in VR, the groundwork was laid for linking behaviors to personal characteristics.

The next step was truly groundbreaking: using VR-behavior data to classify individual consumer traits. In a 2022 study, Alcañiz’s group asked whether a shopper’s personality profile could be inferred from their interactions in a virtual retail store. Sixty participants first took a standard Big Five personality questionnaire and then performed tasks in a VR supermarket (such as free exploration and directed product searches). Without knowing the questionnaire results, the researchers fed the VR-collected data – eye movements, navigation paths, pauses, and object manipulations – into machine learning models to see if they could predict the personalities. Remarkably, the model identified certain trait signatures: for instance, individuals who scored high on Openness to Experience exhibited distinctive gaze patterns, tending to visually scan more broadly and look at a wider variety of products (as if curious and exploring), whereas those high in Extraversion showed more outgoing movement and interaction, such as approaching more products or spending more time in social areas of the virtual store. By combining all behavioural indicators, the system could classify participants’ personality trait levels with accuracy significantly above chance [8]. This was a proof-of-concept that VR shopping behavior can act as a proxy for a psychological profile. Such information could be invaluable – a virtual store could detect that “this user seems very open-minded and exploratory” and then recommend novel, eclectic products to match their style, or conversely recognize a conscientious, detail-focused shopper and provide more extensive product information.

In a similar vein, another study focused on impulsive buying tendencies – a trait of great interest to retailers – and whether it can be detected in VR. Here, the team correlated users’ in-VR behavior with their scores on an impulsivity measure (the Barratt Impulsiveness Scale). They found that highly impulsive individuals had telltale behaviors in the virtual store: for example, during an unstructured browsing task, they were more prone to rapidly switching directions and making quick purchase decisions with minimal deliberation. During more goal-directed tasks, those same individuals showed less perseverance – if they didn’t find a target item quickly, their search strategy deteriorated faster than that of less impulsive participants. By feeding numerous such features into an SVM classifier, the researchers could identify participants with high versus low impulsivity with about 87% accuracy [18]. This is a striking result: it implies that a VR shopping app could, in principle, sense if a user is likely impulsive (versus methodical) and perhaps adjust the experience accordingly – for instance, by inserting just-in-time discount offers for those susceptible to impulse, or conversely, providing comparison tools for those who are more deliberative. The study also noted which signals were most informative for impulsivity: interestingly, posture and interaction patterns mattered a lot. Highly impulsive shoppers tended to have more restless head movements and more frequent short interactions (grabbing many items quickly), whereas less impulsive people had steadier postures and longer, more focused interactions with fewer items. These insights enrich consumer research by linking physical movement traits to underlying decision styles – something rarely possible outside of VR.
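
Behavioral features of the kind discussed here might be computed from raw interaction logs along the following lines. The feature definitions and names are hypothetical illustrations rather than the study's actual measures.

```python
import numpy as np

def direction_switch_rate(positions, timestamps):
    """Rate (per second) at which walking direction reverses along the
    first coordinate axis, a possible 'restless movement' indicator.

    positions: (N, 2) array of planar positions; timestamps: (N,) seconds.
    """
    v = np.diff(positions, axis=0)
    heading = np.sign(v[:, 0])
    heading = heading[heading != 0]          # ignore stationary steps
    switches = np.sum(heading[1:] != heading[:-1])
    return switches / (timestamps[-1] - timestamps[0])

def interaction_features(grab_times, release_times):
    """Number of product grabs and their mean duration in seconds,
    computed from paired grab/release event timestamps."""
    durations = np.array(release_times, float) - np.array(grab_times, float)
    return len(durations), float(durations.mean())
```

Many such per-session scalars, stacked into a feature vector, are what an SVM of the kind the study describes would be trained on.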

Finally, extending this line of “behavioral biometrics,” the team showed that even basic demographic attributes – like age and gender – can be predicted from VR shopping behavior with notable accuracy. In a 2023 Virtual Reality journal article, they reported that by using features of gaze, navigation, and hand movements, a machine learning model could classify a shopper’s gender and age group (>70% accuracy for gender, for example) just from a few minutes of VR interaction. Older adults in the study, for instance, navigated the virtual store more slowly and thoroughly, whereas younger participants moved more rapidly and were prone to making selections more impulsively; gender differences emerged in gaze patterns on certain product categories and in head movement dynamics. While demographic detection might be less conceptually novel than personality detection (after all, retailers often try to infer demographics), the fact that it can be done without any questionnaires or profile login – purely via behavior – is technologically impressive [16]. It hints at a future where a new customer could enter a virtual shop anonymously, and the system, reading their subtle behaviors, adapts its interface (from product recommendations to avatar representations) to what it predicts about that customer’s profile.

Collectively, these studies push the envelope of personalization and privacy in consumer research. They effectively port the concept of “biomarkers” (so central in Alcañiz’s clinical research) into the consumer domain: here, instead of biomarkers of autism, we have biomarkers of shopping style and preference. The novelty lies in treating behavior in immersive environments as data rich enough to decode individual differences. This approach contrasts with traditional market segmentation that relies on surveys or past purchase data; instead, it infers traits on the fly from natural behavior. The implications are profound. On one hand, it promises more personalized and engaging experiences – a longstanding goal of marketing. On the other, it raises ethical questions about consumer consent and data use when a system can know you’re impulsive or extraverted without you telling it. By being among the first to demonstrate these capabilities, Prof. Alcañiz’s team has sparked conversations on how virtual retail analytics might be harnessed in the coming years. In any case, from a scientific perspective, this work is a clear paradigm shift, showing that immersive technology can not only measure how consumers act, but also who they are.

Multisensory and Extended Reality Innovations

A recurring theme in Alcañiz’s research is the exploration of multisensory inputs and novel interface modalities to see how they influence consumer psychology. The team’s studies do not confine themselves to vision and motion; they have incorporated senses like smell, and examined platforms beyond VR, such as Mixed Reality (MR), thereby broadening the scope of consumer neuroscience. These investigations add an extra dimension of novelty because they touch on sensory factors and technologies that traditional consumer research often overlooks.

One particularly innovative study examined the effect of human chemosignals (body odors) on consumer decision-making. While ambient scents like lavender or citrus are known to affect shopper mood, this work ventured into new territory by using subliminal human odors as social primes in a retail context. In a controlled experiment, participants were exposed to almost imperceptible amounts of body odor collected from individuals in different emotional states (e.g., fear-induced sweat versus “happy” sweat versus a neutral clean scent) while making purchase decisions. The findings were striking: consumers who were unknowingly primed with any human emotional odor – whether fearful or happy – reached their purchase decisions significantly faster than those in a no-odor condition. In other words, just the subtle presence of human body odor (undetected at a conscious level) sped up decision-making in a shopping task [1]. The authors interpreted this as evidence that human odors cue an unconscious sense of “presence of others,” which can influence social and cognitive behavior. In evolutionary terms, the smell of another person might signal that it’s a social situation, potentially triggering a slight increase in arousal or competitive urgency (“someone else is here, I should decide”). The broader implication for consumer environments is fascinating: whereas retailers have used pleasant fragrances to improve ambiance, this research suggests that even social chemosignals – essentially, the scent of other humans – could affect shopping pace and perhaps outcomes. It must be noted that using body odors in stores is not practical or necessarily desirable, but scientifically, this study expanded the understanding of olfaction in consumer behavior. 
It proved the concept that non-conscious olfactory cues can bias consumer decisions, opening up a new line of inquiry into how social presence cues (whether through VR avatars or subtle sensory inputs) might enhance or alter the consumer experience. This is a clear example of the lab’s willingness to test bold, novel hypotheses about sensory influences that standard marketing research would rarely consider.

In parallel, the LabLENI group explored Mixed Reality as an emerging medium to blend digital and physical shopping. Mixed Reality (MR), such as seeing virtual objects overlaid on real ones through smart glasses, promises to combine the convenience of online shopping with the tactile confidence of in-store browsing. In a 2023 study, the team evaluated how MR affects user behavior and experience in a retail scenario [15]. Participants donned MR glasses and interacted with physical products that had virtual information or enhancements layered onto them (for example, a shopper could see additional holographic details, like nutrition info or styling options, hovering next to a product, and even virtually try an item without unboxing it). The experiment compared this MR-assisted shopping with a traditional setup (no MR, just physical products and printed info). The results showed clear differences: MR users interacted with products more richly and frequently, as the virtual overlays invited them to examine items from all angles, trigger animations, or see customizations. Metrics like the number of product touches and interaction duration increased with MR, indicating deeper engagement [15]. Interestingly, decision-making metrics shifted as well. For relatively straightforward, utilitarian purchases (say a basic appliance), MR tended to shorten decision time, presumably because the augmented information helped users quickly resolve uncertainties (they could instantly pull up specs, see the item in use virtually, etc.). However, for hedonic purchases or very novel products, MR sometimes led to more exploration, which could lengthen the deliberation – not a negative outcome, as it signaled engagement. From self-reports, shoppers overwhelmingly found the MR-enhanced experience both more enjoyable and more useful than the conventional one. They reported higher satisfaction and entertainment, and also felt they learned more about the products.
Crucially, these positive responses translated into behavior: participants who used MR expressed stronger purchase intentions and greater willingness to recommend the experience to others. The study therefore provided some of the first empirical evidence that MR can boost both the hedonic value (fun, novelty) and the utilitarian value (effectiveness, confidence) of shopping. For industry, it offered validation that investments in AR/MR shopping technology can pay off in customer satisfaction and, potentially, sales. For science, it revealed how altering the vividness and interactivity of the retail experience (two key dimensions often discussed in media theory) directly impacts consumer cognition and emotion. This MR research complements the VR work, reinforcing a central message: increased sensory richness and interactivity tend to produce more engaged and more positive consumers. Yet the MR study also underscores the importance of context, showing, for instance, that the impact can vary by product type, with MR accelerating some decisions while prolonging others. Such fine-grained insights are vital for developing a theoretical understanding of when and how immersive technology influences consumer behavior.
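Engagement metrics of the kind reported in the MR study (product touches, interaction duration) are typically derived from timestamped interaction logs. The sketch below is purely illustrative: the event names and log format are assumptions for demonstration, not the study's actual instrumentation.

```python
# Hypothetical sketch: aggregating raw interaction logs into simple
# engagement metrics (touch counts, time held per product).
# The log format and event names are illustrative assumptions.
from collections import defaultdict

def engagement_metrics(events):
    """events: iterable of (timestamp_s, product_id, event_type) tuples,
    where event_type is 'grab' or 'release'. Returns per-product touch
    counts and total seconds each product was held."""
    touches = defaultdict(int)
    held = defaultdict(float)
    grab_time = {}
    for t, product, kind in sorted(events):
        if kind == "grab":
            touches[product] += 1          # each grab counts as one touch
            grab_time[product] = t
        elif kind == "release" and product in grab_time:
            held[product] += t - grab_time.pop(product)
    return dict(touches), dict(held)

# Toy log: the blender is picked up twice, the lamp once
log = [
    (0.0, "blender", "grab"), (4.5, "blender", "release"),
    (6.0, "lamp", "grab"), (9.0, "lamp", "release"),
    (10.0, "blender", "grab"), (12.5, "blender", "release"),
]
counts, seconds = engagement_metrics(log)
```

Comparing such aggregates between an MR and a control condition is the kind of behavioral evidence the study draws on.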

Through these multisensory and MR investigations, Alcañiz’s team has essentially been stress-testing the boundaries of immersive consumer research. They ask: what if we engage smell in addition to sight and sound? What if we let the consumer’s real and virtual worlds merge? Each answer adds a new piece to the puzzle of human behavior in mediated environments. Notably, these contributions often represent “firsts” – e.g., the first demonstration of emotional human scents affecting shopping decisions, or one of the first rigorous user studies of MR in retail – highlighting the novelty that characterizes the lab’s approach. By expanding the sensory and technological repertoire of consumer neuroscience, this work ensures that the field remains at the cutting edge, keeping pace with emerging tech that is increasingly relevant to how modern consumers shop and experience brands.

Novelty and Impact

In summary, the collection of research in Prof. Alcañiz’s extended CV on consumer neuroscience and immersive technology showcases a series of conceptual and methodological breakthroughs that have substantially broadened the horizons of marketing and consumer research. One clear through-line is the reimagining of where and how we study consumers. Traditionally confined to surveys, focus groups, or at best one-way mirror lab studies, consumer research can now take place in fully instrumented virtual environments that capture the richness of real-life behavior. This is a paradigmatic shift – much as his team has championed XR-based behavioral biomarkers in clinical research, here they introduced XR as a powerful lens for consumer behavior. The Virtual Experience Marketing framework Alcañiz co-authored in 2019 distilled this vision, laying out how immersive technologies enable a new kind of experiment where realism and control need not be trade-offs but are combined. This framework and the body of evidence supporting it have given academic researchers a roadmap for rigorous XR studies, and it earned a place in the literature as a foundational reference for XR marketing research [2]. Further disseminating this vision, the team contributed a comprehensive book chapter on using XR technologies for consumer behavior analysis, effectively roadmapping the field’s future by summarizing current capabilities and pointing to emerging opportunities [21]. Such scholarly work has helped solidify the idea of “immersive analytics” in consumer research – treating VR/AR not just as gimmicks for consumer engagement, but as scientific instruments to advance our understanding of consumer psychology.

The innovations across the individual studies are numerous. To highlight a few: the demonstration that neurophysiological signals can predict market-level outcomes was a first-of-its-kind contribution that bridged laboratory consumer neuroscience with actual advertising success in the field [6]. It shifted the dialogue from “neuromarketing provides interesting insights” to “neuromarketing data can directly inform campaign decisions and outcome forecasts,” thereby greatly enhancing the perceived value of consumer neuroscience methods. The work on emotion recognition in VR proved that emotional states can be objectively measured in real time as people experience marketing content or retail environments [4]. This introduces the possibility of content that adapts to audience emotion – for example, a VR retail kiosk that senses frustration and offers help, or an entertainment experience that adjusts pacing if boredom is detected. Such adaptive, emotionally intelligent systems are a direct outgrowth of the capabilities this research has established. The behavioral biometrics studies (personality, impulsivity, demographics) represent a paradigm shift in personalization: rather than relying on what consumers say or have done before, one can personalize based on who they are, inferred from how they move and look around in a virtual space. This is a fundamentally new type of consumer insight made possible by AI and immersive tech working in tandem, and it opens avenues for “precision marketing” akin to precision medicine. Indeed, these advances might be seen as laying the groundwork for psychographic segmentation 2.0 – where segments are identified dynamically by behavioral patterns rather than broad demographics.
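The emotion-recognition results above rest on features extracted from physiological signals such as heartbeat dynamics. As a hedged illustration only (these are textbook heart rate variability measures, not the authors' published pipeline), two standard features often used as arousal markers can be computed from a sequence of RR intervals:

```python
# Minimal sketch (not the authors' pipeline): two standard heart rate
# variability (HRV) features widely used in affective computing,
# computed from a list of RR intervals in milliseconds.
import math

def hrv_features(rr_ms):
    """Return (SDNN, RMSSD) for a sequence of RR intervals in ms."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: standard deviation of all RR intervals (overall variability)
    sdnn = math.sqrt(sum((rr - mean_rr) ** 2 for rr in rr_ms) / n)
    # RMSSD: root mean square of successive differences
    # (a common proxy for parasympathetic, beat-to-beat variability)
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    return sdnn, rmssd

# Toy trace of six RR intervals around 800 ms
trace = [800, 810, 790, 805, 795, 800]
sdnn, rmssd = hrv_features(trace)
```

In an emotion-recognition setting, features like these (alongside EEG and electrodermal measures) would be fed to a classifier trained against self-reported or induced emotional states.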

Beyond novelty, the impact on the academic community is reflected in how widely these works are cited and built upon; several of the papers have accumulated high citation counts despite their recency, indicating that other researchers are seizing on these ideas. The research comparing real and virtual emotional responses [7][9] has informed debates on the validity of VR experiments, strengthening confidence in using VR not just for consumer research but for psychology experiments in general. The advertising studies [6][14][17] have contributed to the nascent literature of what one might call “immersive advertising” or “neuro-advertising,” influencing subsequent studies that examine how attention and emotion in new media formats translate into persuasion. And within the still-young field of consumer neuroscience, Alcañiz’s multidisciplinary approach, spanning cognitive psychology, physiology, computer science, and marketing, serves as a model for rigorously and holistically investigating the consumer experience. His lab’s work regularly appears in top-tier journals across multiple fields (psychology, engineering, marketing), exemplifying its cross-disciplinary impact. Notably, the methodologies developed (e.g., open-source algorithms for VR eye-tracking [10] and integrated analysis pipelines for multimodal data) are enabling other researchers to adopt similar techniques more easily, thereby accelerating the field’s growth.
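Fixation-identification algorithms of the kind referenced above [10] commonly build on dispersion-based logic. The sketch below shows the classic 2D dispersion-threshold (I-DT) idea only; the published VR algorithm operates on immersive, head-free gaze and differs in its details, so treat this as an illustration of the family of techniques, not the paper's method.

```python
# Illustrative 2D sketch of the dispersion-threshold (I-DT) idea behind
# fixation identification. The VR algorithm in [10] adapts this kind of
# logic to immersive, head-free gaze, so details here differ from it.

def _dispersion(window):
    """Dispersion of a window of (x, y) samples: x-range plus y-range."""
    xs, ys = zip(*window)
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(gaze, max_dispersion=1.0, min_samples=4):
    """Group consecutive (x, y) gaze samples into fixations.

    A run of at least `min_samples` points counts as a fixation while its
    dispersion stays within `max_dispersion`; returns (start, end) index
    pairs with `end` exclusive."""
    fixations, start = [], 0
    while start + min_samples <= len(gaze):
        end = start + min_samples
        if _dispersion(gaze[start:end]) <= max_dispersion:
            # grow the window until dispersion exceeds the threshold
            while end < len(gaze) and _dispersion(gaze[start:end + 1]) <= max_dispersion:
                end += 1
            fixations.append((start, end))
            start = end
        else:
            start += 1  # slide one sample forward and retry
    return fixations

# Two tight gaze clusters separated by a saccade-like jump
trace = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1),
         (5.0, 5.0), (5.1, 5.0), (5.0, 5.1), (5.1, 5.1)]
fix = idt_fixations(trace)
```

Once fixations are segmented this way, downstream metrics such as fixation count and dwell time per product become straightforward to compute.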

From an industry standpoint, these contributions are forward-looking yet practical. Companies experimenting with VR shopping or AR product demos often face the question: do these technologies genuinely improve customer outcomes or just add novelty? Thanks to this body of research, we have answers backed by data: yes, they can improve outcomes like engagement, memory, decision quality, and satisfaction – but only if designed with human behavior in mind (for instance, understanding the attention trade-offs of 360° content, or the optimal way to present info in AR without overload). The finding that presence mediates persuasion in VR marketing experiences [12] is particularly influential – it suggests that marketers should design for maximum immersion (visual, auditory, even olfactory) when they want to truly transport consumers and leave a lasting impact. Conversely, the insight about cognitive overload in 360° ads [14] serves as a caution that more interactivity isn’t always better in every dimension, refining how advertisers might use such formats (perhaps more for experiential brand building than for information-heavy messages). Even the more experimental findings, like the effect of human scent [1] or incongruent music [17], spark creative ideation in marketing teams about non-obvious ways to capture consumer attention. Already, we see experiential marketing campaigns that, for example, use ambient human crowd noise (a “sound” analog of social presence) or deliberately off-beat music in ads – tactics that resonate with the principles uncovered in LabLENI’s research.

Some related papers
  1. Alcañiz, M., Giglioli, I. A. C., Carrasco-Ribelles, L. A., Minissi, M. E., López, C. G., & Semin, G. R. (2023). How priming with body odors affects decision speeds in consumer behavior. Scientific Reports, 13(1), 609.
  2. Alcañiz, M., Bigné, E., & Guixeres, J. (2019). Virtual reality in marketing: a framework, review, and research agenda. Frontiers in Psychology, 10, 1530.
  3. Marín-Morales, J., Llinares, C., Guixeres, J., & Alcañiz, M. (2020). Emotion recognition in immersive virtual reality: From statistics to affective computing. Sensors, 20(18), 5163.
  4. Marín-Morales, J., Higuera-Trujillo, J. L., Greco, A., Guixeres, J., Llinares, C., Scilingo, E. P., … & Valenza, G. (2018). Affective computing in virtual reality: emotion recognition from brain and heartbeat dynamics using wearable sensors. Scientific Reports, 8(1), 13657.
  5. Martínez-Navarro, J., Bigné, E., Guixeres, J., Alcañiz, M., & Torrecilla, C. (2019). The influence of virtual reality in e-commerce. Journal of Business Research, 100, 475–482.
  6. Guixeres, J., Bigné, E., Ausín Azofra, J. M., Alcañiz Raya, M., Colomer Granero, A., Fuentes Hurtado, F., & Naranjo Ornedo, V. (2017). Consumer neuroscience-based metrics predict recall, liking and viewing rates in online advertising. Frontiers in Psychology, 8, 1808.
  7. Marín-Morales, J., Higuera-Trujillo, J. L., Greco, A., Guixeres, J., Llinares, C., Gentili, C., … & Valenza, G. (2019). Real vs. immersive-virtual emotional experience: Analysis of psycho-physiological patterns in a free exploration of an art museum. PLOS ONE, 14(10), e0223881.
  8. Khatri, J., Marín-Morales, J., Moghaddasi, M., Guixeres, J., Giglioli, I. A. C., & Alcañiz, M. (2022). Recognizing personality traits using consumer behavior patterns in a virtual retail store. Frontiers in Psychology, 13, 752073.
  9. Marín-Morales, J., Higuera-Trujillo, J. L., Guixeres, J., Llinares, C., Alcañiz, M., & Valenza, G. (2021). Heart rate variability analysis for the assessment of immersive emotional arousal using virtual reality: Comparing real and virtual scenarios. PLOS ONE, 16(7), e0254098.
  10. Llanes-Jurado, J., Marín-Morales, J., Guixeres, J., & Alcañiz, M. (2020). Development and calibration of an eye-tracking fixation identification algorithm for immersive virtual reality. Sensors, 20(17), 4956.
  11. Colomer Granero, A., Fuentes-Hurtado, F., Naranjo Ornedo, V., Guixeres Provinciale, J., Ausín, J. M., & Alcañiz Raya, M. (2016). A comparison of physiological signal analysis techniques and classifiers for automatic emotional evaluation of audiovisual contents. Frontiers in Computational Neuroscience, 10, 74.
  12. Di Dalmazi, M., Mandolfo, M., Guixeres, J., Alcañiz Raya, M., & Lamberti, L. (2024). How immersive technologies impact behavioral responses in destination marketing: the role of physiological arousal, presence, and age. International Journal of Contemporary Hospitality Management, 36(11), 3628–3650.
  13. Bigné, E., Simonetti, A., Guixeres, J., & Alcañiz, M. (2024). Visual attention and product interaction: a neuroscientific study on purchase across two product categories in a virtual store. International Journal of Retail & Distribution Management, 52(4), 389–406.
  14. Ausín-Azofra, J. M., Bigné, E., Ruiz, C., Marín-Morales, J., Guixeres, J., & Alcañiz, M. (2021). Do you see what I see? Effectiveness of 360-degree vs. 2D video ads using a neuroscience approach. Frontiers in Psychology, 12, 612717.
  15. Gil-López, C., Guixeres, J., Marín-Morales, J., Torrecilla, C., Williams, E., & Alcañiz, M. (2023). Is mixed reality technology an effective tool for retail? A vividness and interaction perspective. Frontiers in Virtual Reality, 4, 1067932.
  16. Gil-López, C., Guixeres, J., Moghaddasi, M., Khatri, J., Marín-Morales, J., & Alcañiz, M. (2023). Recognizing shopper demographics from behavioral responses in a virtual reality store. Virtual Reality, 27(3), 1937–1966.
  17. Ausín, J. M., Bigné, E., Marín, J., Guixeres, J., & Alcañiz, M. (2021). The background music-content congruence of TV advertisements: A neurophysiological study. European Research on Management and Business Economics, 27(2), 100154.
  18. Moghaddasi, M., Marín-Morales, J., Khatri, J., Guixeres, J., Chicchi Giglioli, I. A., & Alcañiz, M. (2021). Recognition of customers’ impulsivity from behavioral patterns in virtual reality. Applied Sciences, 11(10), 4399.
  19. Gómez-Zaragozá, L., Marín-Morales, J., Parra, E., Guixeres, J., & Alcañiz, M. (2020). Speech Emotion Recognition from Social Media Voice Messages Recorded in the Wild. In HCI International 2020 – Posters (Vol. 22, pp. 330–336). Springer, Cham.
  20. Colomer, A., Naranjo, V., Guixeres, J., Rojas, J. C., Coret, J., & Alcañiz, M. (2015, January). Brain activity quantification for sport audiovisual content visualization using EEG. In Proceedings of the International Conference on Bio-inspired Systems and Signal Processing (Vol. 2, pp. 145–149). SCITEPRESS.
  21. Gil-López, C., Guixeres, J., Marín-Morales, J., & Alcañiz, M. (2022). Use of XR Technologies for Consumer Behavior Analysis. In Roadmapping Extended Reality: Fundamentals and Applications (pp. 283–308). Springer International Publishing.