HCI Deep Dives

By: Kai Kunze

Overview

HCI Deep Dives is your go-to podcast for exploring the latest trends, research, and innovations in Human Computer Interaction (HCI). AI-generated using the latest publications in the field, each episode dives into in-depth discussions on topics like wearable computing, augmented perception, cognitive augmentation, and digitalized emotions. Whether you’re a researcher, practitioner, or just curious about the intersection of technology and human senses, this podcast offers thought-provoking insights and ideas to keep you at the forefront of HCI.

Copyright 2024 All rights reserved.

Science
Episodes
  • TAFFC 2025 Micro-Expressions: Could Micro-Expressions Be Quantified? Electromyography Gives Affirmative Evidence
    2026/01/27

    Micro-expressions are fleeting facial movements lasting just 40-200 milliseconds that are believed to reveal concealed emotions. But can these subtle expressions actually be measured objectively? This study provides the first direct electromyographic (EMG) evidence that micro-expressions are real, quantifiable muscle activations—not just visual artifacts. By placing electrodes on participants' faces while they attempted to suppress genuine emotions, researchers captured the electrical activity of facial muscles during both macro-expressions and micro-expressions. The results show that micro-expressions involve significantly smaller muscle contractions than regular expressions, explaining why they're so hard to detect visually. The findings also reveal that micro-expressions are truly involuntary "leakage"—participants couldn't fully suppress their emotional responses even when trying. This research has important implications for lie detection, clinical assessment, and understanding the fundamental nature of emotional expression.
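
    The quantification itself follows standard surface-EMG practice: band-pass the raw signal, then compare movement epochs via a moving-RMS amplitude envelope. The sketch below is a minimal, hypothetical version of that analysis; the sampling rate, filter band, and synthetic macro/micro epochs are assumptions for illustration, not the study's recording setup or exact pipeline.

    ```python
    # Illustrative surface-EMG amplitude analysis: band-pass filter, then
    # moving-RMS envelope. Sampling rate, filter band, and the synthetic
    # "macro"/"micro" epochs are assumptions, not the study's pipeline.
    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 1000  # sampling rate in Hz (assumed)

    def emg_envelope(raw, fs=FS, band=(20.0, 450.0), win_ms=50):
        """Return the moving-RMS envelope of a raw EMG trace."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, raw)  # suppress drift and out-of-band noise
        win = int(fs * win_ms / 1000)
        kernel = np.ones(win) / win
        return np.sqrt(np.convolve(filtered ** 2, kernel, mode="same"))

    # Synthetic stand-ins: a strong 1 s burst vs. a weak 150 ms burst.
    rng = np.random.default_rng(0)
    t = np.arange(2 * FS) / FS
    burst = np.sin(2 * np.pi * 80 * t)  # 80 Hz carrier, inside the EMG band
    macro = rng.normal(0, 0.02, t.size) + 0.50 * burst * ((t > 0.5) & (t < 1.5))
    micro = rng.normal(0, 0.02, t.size) + 0.10 * burst * ((t > 0.9) & (t < 1.05))

    for name, epoch in [("macro", macro), ("micro", micro)]:
        env = emg_envelope(epoch)
        above = (env > 0.5 * env.max()).sum() / FS * 1000  # ms above half-peak
        print(f"{name}: peak envelope {env.max():.3f}, ~{above:.0f} ms above half-peak")
    ```

    On data like these, the micro epoch shows both a much smaller peak envelope and a far shorter supra-threshold span, which is the kind of contrast the paper measures directly from facial muscles.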

    Jingting Li, Shaoyuan Lu, Yan Wang, Zizhao Dong, Su-Jing Wang, and Xiaolan Fu. 2025. Could Micro-Expressions Be Quantified? Electromyography Gives Affirmative Evidence. IEEE Transactions on Affective Computing, vol. 16, no. 4, 2025. https://doi.org/10.1109/TAFFC.2025.3575127

    14 min
  • TAFFC 2025 Music Emotion: Are We There Yet? A Brief Survey of Music Emotion Prediction Datasets, Models and Outstanding Challenges
    2026/01/20

    Music has long been known to evoke powerful emotions, but can machines truly understand and predict these emotional responses? This survey paper takes stock of the field of music emotion recognition (MER), examining the datasets, computational models, and persistent challenges that shape this research area. The authors review how emotion is represented—from categorical labels to dimensional models like valence-arousal—and analyze the most widely used datasets including the Million Song Dataset and MediaEval benchmarks. They trace the evolution from traditional machine learning approaches using hand-crafted audio features to modern deep learning architectures. Despite significant progress, the paper identifies fundamental challenges: the subjective nature of emotional responses to music, the difficulty of obtaining reliable ground truth labels, and the gap between controlled laboratory studies and real-world listening contexts.
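
    To make the traditional pipeline the survey describes concrete, here is a minimal sketch of hand-crafted-feature regression onto the valence-arousal plane. The synthetic clips, random labels, and random-forest model are placeholders for illustration, not components of any surveyed system.

    ```python
    # Illustrative hand-crafted-features pipeline for dimensional music
    # emotion regression (valence-arousal in [-1, 1]). Clips, labels, and
    # model choice are placeholders, not taken from the survey.
    import numpy as np
    import librosa
    from sklearn.ensemble import RandomForestRegressor

    SR = 22050  # assumed sampling rate

    def clip_features(y, sr=SR):
        """Summarize one audio signal as a fixed-length feature vector."""
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
        chroma = librosa.feature.chroma_stft(y=y, sr=sr)
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1),
                               chroma.mean(axis=1)])

    # Synthetic stand-ins for a labeled MER dataset (replace with real
    # clips and human valence-arousal annotations).
    rng = np.random.default_rng(0)
    clips = [rng.normal(0, 0.1, SR * 5).astype(np.float32) for _ in range(8)]
    labels = rng.uniform(-1, 1, size=(8, 2))  # columns: valence, arousal

    X = np.stack([clip_features(y) for y in clips])
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X[:6], labels[:6])  # joint multi-output regression
    print("predicted (valence, arousal):", model.predict(X[6:]))
    ```

    Modern deep approaches replace the fixed feature extractor with learned representations, but the ground-truth labels remain the bottleneck the survey highlights: they are subjective, expensive, and context-dependent.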

    Jaeyong Kang and Dorien Herremans. 2025. Are We There Yet? A Brief Survey of Music Emotion Prediction Datasets, Models and Outstanding Challenges. IEEE Transactions on Affective Computing, vol. 16, no. 4, 2025. https://doi.org/10.1109/TAFFC.2025.3583505

    12 min
  • UIST 2025 Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch
    2026/01/16

    Virtual body extensions like wings or tails offer exciting new experiences in VR, but using them naturally—especially parts you can't see, like a tail—requires proprioceptive feedback to sense position and force without relying on vision. This paper introduces the "Imaginary Joint," a novel approach that uses skin-stretch feedback at the interface between your body and a virtual extension. A wearable device stretches and compresses skin on both sides of the waist to convey joint angle and torque from a virtual tail. The system simultaneously communicates both rotation and force by superimposing skin deformations. In controlled experiments, skin-stretch feedback significantly outperformed vibrotactile feedback in perceptual accuracy, sense of embodiment, and naturalness—with participants reporting the sensation felt remarkably like having an actual tail.
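
    One plausible way to superimpose the two channels is to drive the paired actuators with a differential component for joint angle and a common-mode component for torque. The sketch below assumes that mapping, along with all gains and travel limits; the published device's actual control scheme may differ.

    ```python
    # Hypothetical mapping from a virtual tail's joint state to paired
    # skin-stretch actuators at the waist. The differential/common-mode
    # split and all constants are assumptions for illustration only.
    from dataclasses import dataclass

    MAX_STRETCH_MM = 8.0  # assumed actuator travel limit per side

    @dataclass
    class StretchCommand:
        left_mm: float   # + stretch, - compress on the left waist
        right_mm: float  # + stretch, - compress on the right waist

    def tail_to_stretch(angle_rad: float, torque_nm: float,
                        k_angle: float = 6.0, k_torque: float = 2.0) -> StretchCommand:
        """Map virtual tail angle and torque to per-side skin displacement (mm)."""
        diff = k_angle * angle_rad     # angle: antagonistic (differential) component
        common = k_torque * torque_nm  # torque: superimposed common-mode component

        def clip(v: float) -> float:
            return max(-MAX_STRETCH_MM, min(MAX_STRETCH_MM, v))

        return StretchCommand(left_mm=clip(common + diff),
                              right_mm=clip(common - diff))

    # Tail swung right (+0.4 rad) while resisting a 1.5 Nm load: the
    # asymmetry between sides encodes angle, the shared offset encodes torque.
    print(tail_to_stretch(0.4, 1.5))
    ```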

    Shuto Takashita, Jürgen Steimle, and Masahiko Inami. 2025. Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch. In The 38th Annual ACM Symposium on User Interface Software and Technology (UIST '25), September 28–October 01, 2025, Busan, Republic of Korea. ACM, New York, NY, USA, 15 pages. https://doi.org/10.1145/3746059.3747800

    14 min
No reviews yet