
/projects/

Here are some samples of my research projects. Check my GitHub and Google Scholar for more.

Publications

Publication · IEEE Transactions on Affective Computing

TouchTales: Recognizing Naturalistic Emotions Through Touch in Storytelling

2026

Rúbia Guerra, Laura Cang, et al. (2026)

A care-centered protocol combining autobiographical storytelling with a touch-sensitive object to elicit and label authentic emotions, with ML results reaching accuracy comparable to physiological models. This is my first PhD publication; I co-led the work with Laura, designing the session structure, developing the touch data analysis protocol, and co-writing the ethical and methodological framing for sustainable affective data collection, including the discussion and reflection sections.

affective-computing haptics human-emotion annotation-methods protocol-design research-ethics

Publication · ACM REP

Raising the Reproducibility Bar

2025

Joseph Wonsil, Rúbia Guerra, et al. (2025)

Examines challenges in computational reproducibility and proposes a taxonomy and workflow for creating artifacts that support not only re-execution but also comprehension and scrutiny, including integration with tooling and UX studies. I contributed to defining the conceptual space, shaping the design of artifact structures, and analysing how documentation and tool choices affect researchers trying to understand and reuse computational experiments.

reproducibility interaction-design research-ethics user-study

Publication · IEEE ACII

Modeling the 'Kiss my Ass'-Smile: Appearance and Functions of Smiles in Negative Social Situations

2024

Mirella Hladký, Rúbia Guerra, et al. (2024)

Builds a corpus of smiles in negative social situations, proposes a functional category system, and trains models that classify smile functions from morphological signals at above-chance performance. Mirella was the primary researcher; beyond data collection, I contributed to modelling smile functions from visual features and to interpreting how mismatches between expression and internal emotion challenge automatic emotion recognition.

affective-computing human-emotion social-emotion dataset-design ml-modeling

Publication · ACM UIST

What is Affective Touch Made Of? A Soft Capacitive Sensor Array Reveals the Interplay between Shear, Normal Stress and Individuality

2024

Devyani McLaren, Jian Gao, Xiuliu Yin, Mirella Hladký, Rúbia Guerra, et al. (2024)

Uses a soft capacitive sensor array to measure shear and normal stress during expressive touch, revealing how individual differences shape the tactile signatures of affective gestures and informing haptic sensing and touch-aware interface design. Beyond data collection, I worked on feature design and data analysis, performing statistical and ML analyses that relate mechanical patterns of touch to affective interpretation, and helped situate the findings within affective computing and haptic interaction.

affective-computing haptics behavioural-sensing mixed-methods sensing-hardware

Publication · IEEE Transactions on Haptics

Feeling (key)Pressed: Comparing The Ways in Which Force and Self-Reports Reveal Emotion

2023

Laura Cang, Rúbia Guerra, et al. (2023)

Explores how keyboard typing force during stressful videogame play reflects emotional state, comparing force measurements to self-reported affect and EEG to propose a passive, unobtrusive sensing approach to emotion recognition. I led the ML modelling and statistical analyses comparing force and EEG (EEG modelling by Bereket Guta) against multi-pass labels, and helped interpret how these results connect haptic interaction, affective computing, and behaviour-aware systems. Published in IEEE Transactions on Haptics; invited for presentation at the World Haptics Conference 2025.

affective-computing haptics behavioural-sensing ml-modeling time-series

Publication · IEEE ACII

Choose or Fuse: Enriching Data Views with Multi-label Emotion Dynamics

2022

Rúbia Guerra, et al. (2022)

Addresses how to represent emotion as a dynamic multi-label phenomenon during interactive gameplay, comparing strategies that choose between or fuse multiple views of self-reported affect to enrich emotion recognition models. I helped conceptualize and statistically analyse these different label views and contributed to the discussion of how these choices affect the robustness of emotion prediction over time.

affective-computing human-emotion annotation-methods ml-modeling time-series

Publication · IEEE World Haptics Conference

Crescendo: Haptic Exploration of Scores for Novice Musicians with Dyslexia

2021

Rúbia Guerra, Hannah Elbaggari, Sabrina Knappe, Juliette Regimbal (2021)

Introduces a multimodal system using haptic feedback via a pantograph device and auditory cues to help novice musicians with dyslexia explore and learn musical notation, evaluated in a preliminary study with dyslexic participants. I co-prototyped the haptic interactions on the Haply force-feedback device with Hannah, designing how notes and bars are rendered through force cues, and contributed to the interaction design, study, and writing around accessibility for dyslexic learners.

accessibility education haptics interaction-design user-study

Publication · Expert Systems with Applications

Automatic Translation of Sign Language with Multi-stream 3D CNN and Generation of Artificial Depth Maps

Rúbia Guerra, et al.

Combines RGB video, artificially generated depth maps, and motion features to recognize signs in Brazilian, Indian, and Korean sign language datasets, improving accuracy without requiring depth sensors. This work was primarily led by Giulia Zanon; I contributed to designing the ML models and discussing implications for accessible, AI-driven communication technology under resource constraints.

accessibility sign-language computer-vision deep-learning ml-modeling

Publication · SBAI

Development of a Database of Libras' Signs for Machine Learning: 3DCNN Case Study

2019

Giulia Zanon, Rúbia Guerra, et al. (2019)

Describes the recording of a new Brazilian Sign Language dataset with 20 signs produced by 10 signers, and evaluates a 3D CNN that reaches 72.6% accuracy as a baseline for future work. I contributed to the ML study, including data processing and 3D CNN experiments, and helped interpret what this early dataset and accuracy level mean for subsequent Libras recognition research.

accessibility sign-language dataset-design deep-learning
PDF PT-BR

Publication · ENIAC

Facial Expression Analysis in Brazilian Sign Language for Sign Recognition

2018

Rúbia Guerra, Tamires Rezende, Frederico Gadelha, et al. (2018)

Investigates approaches to facial feature extraction for Libras recognition and compares Random Forests to SVM and k-NN, showing that RF can match or exceed the performance of these methods. I worked on modelling and feature extraction for facial expressions and helped interpret the results as evidence that non-manual parameters are a consistent, technically viable component for sign language recognition systems.

accessibility sign-language social-emotion dataset-design

Publication · II Congresso de Inovação e Metodologias de Ensino

Analysis of Quality of Life of Electrical Engineering Students

2016

Lucas Chaves, Mateus Rodrigues, Rúbia Guerra, et al. (2016)

Analyses how academic workload, learning environment, and personal factors relate to reported well-being among undergraduate students in an engineering program. I contributed primarily to interpreting the findings, writing the discussion, and presenting the work, highlighting links between quantitative results and students' lived experiences.

education student-wellbeing mixed-methods user-study
PDF PT-BR

MSc Thesis

MSc Thesis · University of British Columbia

Recognizing Naturalistic Emotions Through Touch

2024

Supervised by Prof. Karon MacLean

Uses the FEEL dataset to compare keyboard force and EEG for modelling dynamic emotions during horror game play, develops strategies for handling multi-pass self-reports, and shows that force signals can outperform EEG for tracking emotion dynamics. I designed and executed the complete analysis pipeline, from data processing and model development to evaluation, and articulated implications for building richer, temporally sensitive affective models grounded in real interaction.

affective-computing haptics human-emotion mixed-methods ml-modeling time-series

BSc Thesis

BSc Thesis · UFMG

Deep Learning for Accessibility: Detection and Segmentation of Regions of Interest for Sign Language Recognition Systems

2019

Supervised by Prof. Frederico Gadelha and Dr. Tamires Rezende

Investigates detection and segmentation of regions of interest to support sign language recognition systems, using deep models to focus on relevant parts of the visual scene. I designed the models, carried out the experiments, and analysed how different region-of-interest strategies affect recognition performance in the context of assistive technologies for sign language users.

accessibility sign-language computer-vision deep-learning ml-modeling