1-10 of 1102 publications

Eye candy & eye tunes: Effects of liked vs. disliked music on desire to eat and food choice in an eye-tracking buffet paradigm

2026 · Cognitive Psychology · Invisible
Jonas Potthoff; Anne Schienle · Appetite
Music can evoke both positive and negative moods, which may, in turn, differently affect the processing of food cues. This preregistered eye-tracking study investigated whether self-selected liked versus disliked music affects desire to eat, visual attention to foods of varying sugar content, and subsequent food choice in a buffet-like context. A total of 106 participants (mean age = 25 years; mean body mass index = 22 kg/m²) viewed a buffet with high-sugar foods, low-sugar alternatives, and non-foods while eye movements were recorded. Participants were randomly assigned to a liked music, disliked music, or no music condition. Self-reported desire to eat and food choice were assessed. Disliked music decreased general desire to eat but increased the specific desire to eat high-sugar food. Furthermore, it increased the likelihood of selecting high-sugar foods from the buffet. Liked music and no music were associated with a preference for low-sugar foods. Music did not significantly influence visual attention. Participants consistently looked longer at food than non-food items regardless of their music condition. These findings suggest that music can bias food-related decision-making independently of attentional processes: liked music may encourage healthier choices, whereas disliked music increases susceptibility to high-sugar comfort foods despite reduced general appetite. The results highlight the potential of music as a subtle, non-caloric intervention for promoting low-sugar eating behaviour. They also point to the risks of exposure to disliked music in contexts where food decisions are made, such as restaurants or supermarkets.

Curious yet disgusted: A mobile eye-tracking investigation of visual attention to insect-based snacks in a buffet setting

2026 · Cognitive Psychology · Invisible
Jonas Potthoff; Maya Gumussoy; Anne Schienle; Edwin S. Dalmaijer · Food Quality and Preference
In Western societies, many people are unfamiliar with insect-based foods and reject them, despite their promise as a sustainable alternative to conventional animal protein. This mobile eye-tracking study examined how people view and evaluate insect-based foods in a buffet setting. Thirty-seven participants (mean age = 26 years) freely viewed a buffet containing 12 items from four categories: insect-based snacks, novel non-insect snacks, familiar snacks, and non-food objects. Mobile eye-tracking measured total and mean fixation durations for each item. Participants also rated each food item on disgust and desire to eat. The findings show that insect-based and novel snacks were viewed significantly longer than familiar snacks and non-foods, indicating increased visual engagement rather than oculomotor avoidance. Mean fixation duration did not differ across categories. Insect-based snacks elicited significantly higher disgust and lower desire to eat than both novel and familiar snacks. In conclusion, despite high disgust and low desire to eat, insect-based snacks attracted more visual attention than familiar foods and non-foods. This suggests that food disgust is not associated with the oculomotor avoidance commonly observed when disgust is elicited by non-food stimuli.
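The attention measures reported here, total and mean fixation duration per item, reduce to a simple aggregation over a fixation log. A minimal sketch; the (item_label, duration_ms) record format and the item labels are invented for illustration, not the study's data format:

```python
def fixation_summaries(fixations):
    """Per-item (total_ms, mean_ms) fixation durations from a flat list
    of (item_label, duration_ms) fixation records."""
    totals, counts = {}, {}
    for item, dur in fixations:
        totals[item] = totals.get(item, 0) + dur
        counts[item] = counts.get(item, 0) + 1
    return {item: (totals[item], totals[item] / counts[item])
            for item in totals}

# Hypothetical fixation log from one viewing trial:
log = [("cricket_snack", 300), ("cricket_snack", 500), ("apple", 200)]
summaries = fixation_summaries(log)
# e.g. summaries["cricket_snack"] -> (800, 400.0)
```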

Cognitive load in AR-Supported indoor wayfinding performance: A correlation study

2026 · VR/AR · Core
Fang Xu; Tianyu Zhou; Hengxu You; Jiahao Wu; Scott Ledgerwood; Jing Du · Developments in the Built Environment
Wayfinding is critical for firefighters under high cognitive demands and hazardous conditions. We introduce a real-time cognitive monitoring system combining AR goggles with stabilized pupillometry. To correct errors from headset motion and varying camera-eye distances, we developed a pupil-to-iris diameter ratio method as a stable reference. Implemented on HoloLens 2 with an infrared eye-tracking camera, the system provides robust cognitive load estimates. Thirty firefighters completed a search-and-rescue (SAR) simulation while their pupillary responses and wayfinding performance were recorded. Results revealed a significant negative correlation between cognitive load and wayfinding performance (r = −0.397, p < 0.001), and logistic regression identified a critical load threshold (0.48) beyond which performance declined sharply. This study demonstrates the feasibility of using real-time cognitive monitoring with AR wayfinding support for first responders. Stabilized pupillometry and the cognitive load-performance correlation pave the way for AR systems that enhance firefighters’ situational awareness and decision-making under stress.
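The pupil-to-iris ratio is described only at a high level in the abstract; the idea is that the iris has a roughly fixed anatomical diameter, so dividing by it cancels apparent-size changes from headset slippage and camera-eye distance. A minimal sketch, with invented per-frame measurements and function names; the 0.48 threshold is the one reported in the study:

```python
def pupil_iris_ratio(pupil_diam_px, iris_diam_px):
    """Pupil diameter normalized by iris diameter (both in image pixels).
    The iris acts as a fixed reference, so the ratio is insensitive to
    changes in camera-eye distance and headset motion."""
    return pupil_diam_px / iris_diam_px

def load_exceeds_threshold(ratios, threshold=0.48):
    """Flag frames whose stabilized load index exceeds the critical
    threshold (0.48) identified by the study's logistic regression."""
    return [r > threshold for r in ratios]

# Hypothetical per-frame diameters (pixels) from an eye camera:
pupils = [42.0, 55.0, 61.0]
irises = [118.0, 117.0, 116.5]
ratios = [pupil_iris_ratio(p, i) for p, i in zip(pupils, irises)]
flags = load_exceeds_threshold(ratios)  # only the last frame exceeds 0.48
```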

Pupillometry and Brain Dynamics for Cognitive Load in Working Memory

2026 · Machine Learning · Core
Nusaibah Farrukh; Malavika Pradeep; Akshay Sasi; Rahul Venugopal; Elizabeth Sherly · arXiv
Cognitive load, the mental effort required during working memory, is central to neuroscience, psychology, and human-computer interaction. Accurate assessment is vital for adaptive learning, clinical monitoring, and brain-computer interfaces. Physiological signals such as pupillometry and electroencephalography (EEG) are established biomarkers of cognitive load, but their comparative utility and practical integration as lightweight, wearable monitoring solutions remain underexplored. EEG provides high temporal resolution of neural activity but, although non-invasive, is resource-intensive and limited in wearability and cost, whereas pupillometry is non-invasive, portable, and scalable. Existing studies often rely on deep learning models with limited interpretability and substantial computational expense. This study integrates feature-based and model-driven approaches to advance time-series analysis. Using the OpenNeuro 'Digit Span Task' dataset, it investigates cognitive load classification from EEG and pupillometry. Feature-based approaches using Catch-22 features and classical machine learning models outperform deep learning in both binary and multiclass tasks. The findings demonstrate that pupillometry alone can compete with EEG, serving as a portable and practical proxy for real-world applications. These results challenge the assumption that EEG is necessary for load detection, showing that pupil dynamics combined with interpretable models and SHAP-based feature analysis provide physiologically meaningful insights. This work supports the development of wearable, affordable cognitive monitoring systems for neuropsychiatry, education, and healthcare.
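The pipeline shape (compact time-series features fed to a classical classifier) can be illustrated with stand-ins: two hand-rolled features in place of the real Catch-22 set, and a nearest-centroid rule in place of the paper's classical models. None of this is the authors' code, just a sketch of the feature-then-classify pattern:

```python
import statistics

def ts_features(x):
    """Two lightweight time-series features standing in for the Catch-22
    set: signal spread and mean-crossing rate."""
    mu = statistics.fmean(x)
    spread = statistics.pstdev(x)
    crossings = sum(1 for a, b in zip(x, x[1:]) if (a - mu) * (b - mu) < 0)
    return (spread, crossings / (len(x) - 1))

def nearest_centroid(train, labels, query):
    """Classify a feature vector by squared Euclidean distance to the
    per-class feature centroids (a minimal classical classifier)."""
    classes = {}
    for f, y in zip(train, labels):
        classes.setdefault(y, []).append(f)
    centroids = {y: tuple(statistics.fmean(col) for col in zip(*fs))
                 for y, fs in classes.items()}
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: dist(centroids[y], query))

# Synthetic "pupil traces": high load = larger fluctuations.
f_low = ts_features([0.0, 0.1] * 10)
f_high = ts_features([0.0, 1.0] * 10)
pred = nearest_centroid([f_low, f_high], ["low", "high"],
                        ts_features([0.05, 0.95] * 10))
```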

Fixations, blinks, and pupils differentially capture individual and interpersonal dynamics in role-asymmetric mutual gaze interaction

2026 · Social Psychology · Neon
Mehtap Çakır; Anke Huckauf · Scientific Reports
Although eye cues have proven effective in simulated gaze contact, it remains unclear how and through which eye parameters people interpret and use such cues in real interactions. We developed a real-time dyadic paradigm that restricted interaction to the eye region and incorporated asymmetrical roles and temporally structured interaction phases. One partner (listener) experienced emotion-inducing sounds, while the other (observer), unaware of the timing or content, attempted to infer the listener’s emotions solely from eye cues. Using a multi-measure approach, we analyzed fixation, blink, and pupil parameters in 25 dyads. Results showed that the parameters were shaped primarily by role- and phase-related processing demands rather than emotional valence. Blinks indexed role-specific processing demands, adapting to attentional priorities. Interpersonal blink synchronization decreased when partners’ attentional goals diverged, underscoring its dependence on attentional coupling. Fixations reflected shared attention allocation across roles, marked by active visual exploration during mutual gaze phases. Pupil dilation signaled phase-dependent arousal and cognitive effort, particularly for observers. Together, these findings reveal differential sensitivity across eye parameters, extending from attention allocation to social cognition and interpersonal coordination, and highlight the need for multi-measure frameworks to model the eyes as an integrated system for real-time social communication.

Exploratory eye movement patterns in schizophrenia and their potential as biomarker

2026 · Cognitive Psychology · Core
Gizem Dogancali Yanchev; Petranka Chumpalova-Tumbeva; Ivanka Veleva; Kaloyan Stoichev; Snejena Murgova; Georgi Balchev; Emiliya Dimitrova-Ilieva; Aleksandar Todorov; Stanislav Kapinchev · Journal of Biomedical and Clinical Research
Eye movements represent an objective, quantitative indicator of the integrity of cognitive and perceptual processes. Since the beginning of the 20th century, research has shown that individuals with schizophrenia demonstrate characteristic deviations in smooth pursuit, saccade regulation, and visual exploration behaviour. These oculomotor alterations are linked to dysfunctions in attention, executive control, and perceptual organization, and have been discussed as potential biomarkers of impaired neural network regulation. More recent work further implicates disturbances in visual attention and integration, as well as reduced executive control over oculomotor activity, which may manifest in altered gaze behaviour. Building on this literature, the present study examined whether free-viewing eye-movement patterns capture markers of restricted exploration and altered scanpath organization under ecologically valid conditions. We recorded eye movements during passive viewing of landscape and abstract images in three groups: patients with schizophrenia (n = 30), healthy controls (n = 30), and close relatives (n = 21). Visual exploration was quantified using integrative indices of fixation number and duration (mean/median/total), scanpath length, coverage fraction, spatial dispersion (mean/median; dispersion_x and dispersion_y), center bias, fixations per second, gaze entropy (bits), and saccade metrics. Group and image type were tested in 2 × 2 mixed ANOVA models with FDR correction across metrics. In the patient-control analysis (N = 60), a significant main effect of group was observed across multiple exploration and oculomotor parameters after FDR correction (ηp² ≈ .12–.21; pFDR ≤ .032), with no main effect of image type and no group × image type interaction surviving correction, supporting a stimulus-nonspecific alteration of visual exploration in schizophrenia.
In the relatives-controls analysis (N = 51), uncorrected trends suggested a more compact scan pattern (reduced dispersion and saccade amplitude). However, no effects remained significant after FDR correction. Overall, free-viewing eye-movement metrics showed medium-to-large, stimulus-nonspecific group differences in schizophrenia, consistent with a restricted and altered exploration mode, whereas potential vulnerability-related signals in first-degree relatives were weaker and did not survive correction, indicating the need for larger samples and/or targeted paradigms with predefined core metrics in familial-risk designs.
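The "FDR correction across metrics" applied in both analyses is usually the Benjamini-Hochberg step-up procedure; the abstract does not name the exact variant, so this is a sketch of the standard method rather than the authors' code:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up FDR procedure. Returns a list of
    booleans, parallel to pvals, marking which tests survive correction
    at level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose p-value clears its BH threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank
    # Step-up: reject every test ranked at or below k.
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            reject[i] = True
    return reject

# Four hypothetical per-metric p-values; only the smallest survives.
surviving = benjamini_hochberg([0.01, 0.04, 0.03, 0.5])
```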

Soccer players’ ability to use peripheral vision is affected by viewing angles but not crowding

2026 · Sports Science · Core
Christian Vater; Svitlana Pinchuk; Božo Vukojević · Current Issues in Sport Science (CISS)
Introduction: Peripheral vision plays a key role in monitoring multiple players in team sports (Vater et al., 2020; Vater, 2024). In soccer, highly skilled players anchor their gaze on the player in possession while using peripheral vision to monitor surrounding movements in 3 vs. 3 situations (Vater et al., 2019). This study examined how peripheral vision is challenged in a counter-attack scenario by varying the viewing angle between a defender’s direct opponent and a wing striker. We also manipulated crowding, a factor known to limit peripheral perception (Herzog & Sayim, 2022). We predicted that larger viewing angles and higher crowding would impair overrun detection.

Methods: Twenty-two participants (M_age = 20.81 years, SD = 1.81, n_female = 10) with at least 5 years of experience as central defenders or midfielders viewed 3 vs. 3 virtual-reality soccer counterattacks from a central-defender perspective. Their task was to detect when a wing striker overran his defender—on the right, left, or both sides—and indicate this with a hand response. Detection performance served as a measure of peripheral-vision use. Participants simultaneously defended their direct opponent, who could shoot or pass to either striker. A within-subjects 3 (Eccentricity: 20°, 40°, 60°) × 3 (Crowding: 0.25, 0.50, 0.75 Bouma) design was used. Dependent variables included overrun-detection accuracy and response time as well as peripheral detection. Body kinematics were captured with a 14-camera Optitrack system; gaze was recorded with a Pupil Labs Core eye tracker at 120 Hz. Normality and homogeneity were tested for each dependent measure. Parametric tests were used when assumptions were met; otherwise, the Kruskal-Wallis H test was applied. Bonferroni corrections were used for post hoc tests. Effect sizes were reported as η² or ε², with Cohen’s d for pairwise comparisons.

Results: Eccentricity significantly affected overrun-detection accuracy, H(2) = 29.21, p < .001, η² = 0.257. Accuracy was higher at 20° (M = 82.24%) and 40° (M = 83.29%) than at 60° (M = 71.69%), both p < .001. Response times also showed a main effect of eccentricity, H(2) = 29.23, p < .001, η² = 0.446, with faster responses at 20° than 40° or 60°, and at 40° than 60°. Peripheral detection showed the same pattern: H(2) = 23.20, p < .001, η² = 0.372; participants relied more on peripheral vision at 40° and 60° than at 20°. No effects of crowding were found for any measure (all p > .30).

Discussion/Conclusion: Viewing angle strongly influences performance in this sport-specific overrun-detection task, with larger eccentricities reducing accuracy and slowing responses. Although participants used peripheral vision frequently at 40° and 60°, their reduced accuracy and slower responses at 60° suggest that initiating a saccade to peripheral locations may be beneficial (Vater et al., 2020). Contrary to expectations, crowding did not affect performance, possibly because we used sport-specific dynamic rather than artificial, static stimuli or because participants were not restricted to only using peripheral vision.

References
Herzog, M. H., & Sayim, B. (2022). Crowding: Recent advances and perspectives. Journal of Vision, 22(12), 15. https://doi.org/10.1167/jov.22.12.15
Vater, C. (2024). Viewing angle, skill level and task representativeness affect response times in basketball defence. Scientific Reports, 14(1), 3337. https://doi.org/10.1038/s41598-024-53706-9
Vater, C., Williams, A. M., & Hossner, E.-J. (2020). What do we see out of the corner of our eye? The role of visual pivots and gaze anchors in sport. International Review of Sport and Exercise Psychology, 13(1), 81–103. https://doi.org/10.1080/1750984X.2019.1582082
Vater, C., Luginbühl, S. P., & Magnaguagno, L. (2019). Testing the functionality of peripheral vision in a mixed-methods football field study. Journal of Sports Sciences, 37(24), 2789–2797. https://doi.org/10.1080/02640414.2019.1664100

Examining micro-level natural behaviour to improve generalizability in behavioural science: a case study of parent–child joint attention

2026 · Developmental Psychology · Core, Invisible, Neon
Chen Yu; Brianna Kaplan; Sara Schroer; Mary Hayhoe · Philosophical Transactions of the Royal Society B: Biological Sciences
One of the primary aims of cognitive and behavioural sciences is to generate empirical findings that are both reproducible and generalizable to real-world settings. The present study investigates the extent to which results obtained from a structured laboratory task—parent–infant toy play—can be generalized to more naturalistic contexts and everyday activities. We focused on joint attention between parents and infants, a construct that has been extensively examined within developmental science. To characterize parent–infant joint attention during toy play, we recorded and analysed contingent gaze behaviour captured through dual head-mounted eye-tracking devices worn simultaneously by parents and their infants during spontaneous activities such as toy play and meal preparation. By continuously monitoring gaze locations and manual actions, we obtained fine-grained measures of how often dyads fixated on the same object concurrently and how their coordinated visual and manual behaviours contributed to the establishment and maintenance of joint attention. Our results suggest that laboratory findings can be both replicated and generalized when: (i) the study is designed to capture natural behaviours rather than to elicit specific, constrained responses, and (ii) the theoretical constructs are clearly defined and precisely measured through high-resolution behavioural data. This article is part of the theme issue ‘Mechanisms of learning from social interaction’.
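The core joint-attention measure, how often a dyad fixates the same object concurrently, can be sketched as an alignment over two time-locked, object-labelled gaze streams. The record format and operationalization below are invented for illustration and are not the authors' coding scheme:

```python
def joint_attention_fraction(parent_gaze, infant_gaze):
    """Fraction of time-aligned samples in which parent and infant
    fixate the same labelled object. Samples where either partner's
    gaze is not on any tracked object (None) are excluded."""
    both = [(p, c) for p, c in zip(parent_gaze, infant_gaze)
            if p is not None and c is not None]
    if not both:
        return 0.0
    return sum(p == c for p, c in both) / len(both)

# Hypothetical per-sample object labels from dual head-mounted trackers:
parent = ["ball", "ball", "cup", None]
infant = ["ball", "cup", "cup", "cup"]
fraction = joint_attention_fraction(parent, infant)  # 2 of 3 valid samples
```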

From Walkability to Everyday Serendipity: Rethinking the Sustainable Revitalization of Local Shopping Street in Asian Second-Tier Cities

2026 · Architecture & Design · Core
Qing Wang; Yuki Tanaka; Chika Takatori · Social Science Research Network
Local shopping streets have long served as everyday social infrastructures in dense Asian second-tier cities, yet they increasingly struggle to retain vitality amid retail restructuring and large-scale redevelopment. While conventional walkability research explains how people can walk, it rarely addresses why they choose to explore, pause, and discover. This study advances an experiential turn in urban walkability by conceptualizing and operationalizing Walking Serendipity—the micro-moments of curiosity, pleasure, and spontaneous encounter that sustain neighborhood life. Using Fukuoka as a representative second-tier metropolis in Japan, we develop a three-layer spatial–perceptual–affective framework integrating street-morphology metrics, wearable eye-tracking, and affective assessment. Findings from cluster analysis, analysis of variance (ANOVA), and multiple regression reveal a sequential mechanism from spatial diversity through attentional release and emotional arousal to exploratory behavior. Two complementary pathways of walking serendipity are identified: high-footfall, event-driven streets and fine-grained, discovery-driven alley networks. Greenery, transparent façades, pedestrian activity, and the availability of seating consistently enhanced walking serendipity, whereas excessive visual clutter and ground-level distractions—such as inconsistent paving patterns and utility poles—were found to suppress it. Furthermore, a “Daily Serendipity Street” toolkit of low-cost micro-interventions is proposed, including soft landscapes, interface transparency, local cues, and micro-stay nodes, offering an alternative to capital-intensive urban renewal. As a result, the study reconceptualizes walkability as perceptual–emotional infrastructure rather than mere mobility infrastructure, positioning serendipity as a measurable dimension for low-cost and sustainable neighborhood revitalization in walkable Asian second-tier metropolises.

A Comparison of Centroid Tracking and Image Phase for Improved Optokinetic Nystagmus Detection

2026 · Ophthalmology · Neon
Jason Turuwhenua; Mohammad Norouzifard; Zaw LinTun; Misty Edmonds; Rebecca Findlay; Joanna Black; Benjamin Thompson · Journal of Eye Movement Research
Optokinetic nystagmus (OKN) is an involuntary sawtooth eye movement that occurs in the presence of a drifting stimulus. Our experience is that low-amplitude/short-duration OKN can challenge the limits of our commercially available Pupil Neon eye-tracker, leading to false negative OKN detection results. We sought to investigate whether such instances could be remediated. We compared automated OKN detection using: (1) the gaze signal from the Pupil Neon (OKN-G), (2) centroid tracking (OKN-C), and (3) an image-phase-based “motion microscopy” technique (OKN-MMIC). The OKN-C and OKN-MMIC methods were also tested as a remediation step after a negative OKN-G result (OKN-C-STEP, OKN-MMIC-STEP). To validate the approaches, OKN was measured in adults (n = 22) with normal visual acuity while they viewed trials of an OKN induction stimulus shown at four levels of visibility. Confusion matrices and performance measures were determined for a “main” dataset that included all methods, and a “retest” set, which contained instances where centroid tracking failed. For the main set, all tested methods improved upon OKN-G in Matthews correlation coefficient (0.80–0.85 vs. 0.76), sensitivity (0.89–0.95 vs. 0.85), and accuracy (0.91–0.93 vs. 0.88); but only OKN-C yielded better specificity (0.90–0.96 vs. 0.95). For the retest set, the MMIC and MMIC-STEP methods consistently improved upon the performance of OKN-G across all measures.
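The performance measures compared above all derive from a 2 × 2 confusion matrix. A sketch of the standard formulas with invented counts (not the study's data):

```python
import math

def detection_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, accuracy, and Matthews correlation
    coefficient (MCC) from true/false positive and negative counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + fn + tn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return {"sensitivity": sens, "specificity": spec,
            "accuracy": acc, "mcc": mcc}

# Hypothetical detection counts for one method, for illustration only:
m = detection_metrics(tp=90, fp=5, fn=10, tn=95)
```

MCC is often preferred over raw accuracy for detection comparisons like this one because it stays informative when positive and negative trials are imbalanced.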