{"id":1257,"date":"2025-05-07T17:20:42","date_gmt":"2025-05-07T09:20:42","guid":{"rendered":"https:\/\/tongjicdi.com\/?p=1257"},"modified":"2025-05-07T17:20:42","modified_gmt":"2025-05-07T09:20:42","slug":"chi-2025-cdi%e6%88%90%e5%91%98%e5%8f%82%e4%bc%9a%e6%88%90%e6%9e%9c%e9%80%9f%e8%a7%88","status":"publish","type":"post","link":"https:\/\/tongjicdi.com\/index.php\/2025\/05\/07\/chi-2025-cdi%e6%88%90%e5%91%98%e5%8f%82%e4%bc%9a%e6%88%90%e6%9e%9c%e9%80%9f%e8%a7%88\/","title":{"rendered":"CHI 2025 | CDI\u6210\u5458\u53c2\u4f1a\u6210\u679c\u901f\u89c8"},"content":{"rendered":"\n<figure class=\"wp-block-image size-full\"><img fetchpriority=\"high\" decoding=\"async\" width=\"840\" height=\"675\" src=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image.png\" alt=\"\" class=\"wp-image-1258\" srcset=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image.png 840w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-300x241.png 300w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-768x617.png 768w\" sizes=\"(max-width: 840px) 100vw, 840px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"694\" src=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-1-1024x694.png\" alt=\"\" class=\"wp-image-1259\" srcset=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-1-1024x694.png 1024w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-1-300x203.png 300w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-1-768x521.png 768w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-1.png 1080w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>\u4eba\u673a\u4ea4\u4e92\u9886\u57df\u7684\u5e74\u5ea6\u5b66\u672f\u76db\u5bb4\u2014\u2014ACM CHI Conference on Human Factors in Computing Systems (\u7b80\u79f0CHI 2025) 
\u5df2\u4e8e5\u67081\u65e5\u5706\u6ee1\u843d\u5e55\u3002\u4f5c\u4e3a\u5168\u7403\u4eba\u673a\u4ea4\u4e92\u9886\u57df\u6700\u5177\u5f71\u54cd\u529b\u7684\u6807\u6746\u6027\u5b66\u672f\u4f1a\u8bae\u3001\u4e2d\u56fd\u8ba1\u7b97\u673a\u5b66\u4f1a\uff08CCF\uff09\u8ba4\u8bc1\u7684A\u7c7b\u4f1a\u8bae\uff0cCHI\u5728Core Conference Ranking\u4e2d\u4f4d\u5217A*\u7ea7\uff08flagship conference\uff09\uff0c\u5176\u5165\u9009\u8bba\u6587\u4e00\u76f4\u4ee5\u6765\u88ab\u516c\u8ba4\u4e3a\u5177\u6709\u5f88\u9ad8\u7684\u542b\u91d1\u91cf\u3002<strong>CDI\u6570\u5b57\u521b\u65b0\u4e2d\u5fc3<\/strong>&nbsp;\u6b64\u6b21\u5171\u6709<strong>4\u7bc7\u8bba\u6587\u5165\u9009<\/strong>\uff0c\u5305\u542b\u4eba\u7c7b-\u673a\u5668\u4eba\u4ea4\u4e92\u3001\u57fa\u4e8e\u53cd\u601d\u7684\u4e25\u8083\u6e38\u620f\u3001\u5bf9\u8bdd\u5f0f\u667a\u80fd\u4f53\u8bbe\u8ba1\u3001\u53ef\u6301\u7eed\u8bbe\u8ba1\u7b49\u591a\u4e2a\u7814\u7a76\u65b9\u5411\u3002\u4e0b\u9762\u662fCDI\u6210\u5458\u4eec\u5c55\u793a\u7684\u7cbe\u5f69\u7814\u7a76\u6210\u679c\u3002<\/p>\n\n\n\n<p><strong>01<\/strong><\/p>\n\n\n\n<p><strong>Papers<\/strong><\/p>\n\n\n\n<p>\u25cf GenComUI: Exploring Generative Visual Aids as Medium to Support Task-Oriented Human-Robot Communication &nbsp;<\/p>\n\n\n\n<p>\u25cf Walk in Their Shoes to Navigate Your Own Path: Learning About Procrastination Through A Serious Game<\/p>\n\n\n\n<p><strong>02<\/strong><\/p>\n\n\n\n<p><strong>Late-Breaking Work<\/strong><\/p>\n\n\n\n<p>\u25cf Align with Me, Not TO Me: How People Perceive Concept Alignment with LLM-Powered Conversational Agents<\/p>\n\n\n\n<p><strong>03<\/strong><\/p>\n\n\n\n<p><strong>Student Design Competition<\/strong><\/p>\n\n\n\n<p>\u25cf HabitAt: Bridging Humans and Wildlife toward a Sustainable Future<\/p>\n\n\n\n<p><strong>01&nbsp;<\/strong><strong>PAPERS<\/strong><\/p>\n\n\n\n<p><strong>GenComUI: Exploring Generative Visual Aids as Medium to Support Task-Oriented Human-Robot Communication<\/strong><\/p>\n\n\n\n<p><strong>Yate 
Ge<\/strong>,&nbsp;<strong>Meiying Li<\/strong>,&nbsp;<strong>Xipeng Huang<\/strong>,&nbsp;<strong>Yuanda Hu<\/strong>,&nbsp;<strong>Qi Wang<\/strong>, Xiaohua Sun, and<strong>&nbsp;Weiwei Guo\u2020<\/strong><\/p>\n\n\n\n<p><strong>Keywords &nbsp;<\/strong>&nbsp;Human-Robot Interaction, Robot Programming, Service Robots, Conversational Interaction, Large Language Models, Generative UI<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"368\" src=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-2-1024x368.png\" alt=\"\" class=\"wp-image-1260\" srcset=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-2-1024x368.png 1024w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-2-300x108.png 300w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-2-768x276.png 768w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-2.png 1080w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p><strong>Abstract&nbsp;<br><\/strong>This work investigates the integration of generative visual aids in human-robot task communication. We developed GenComUI, a system powered by large language models (LLMs) that dynamically generates contextual visual aids\u2014such as map annotations, path indicators, and animations\u2014to support verbal task communication and facilitate the generation of customized task programs for the robot. This system was informed by a formative study that examined how humans use external visual tools to assist verbal communication in spatial tasks. To evaluate its effectiveness, we conducted a user experiment (n = 20) comparing GenComUI with a voice-only baseline. Qualitative and quantitative analyses demonstrate that generative visual aids enhance verbal task communication by providing continuous visual feedback, thus promoting natural and effective human-robot communication. 
Additionally, the study offers a set of design implications, emphasizing how dynamically generated visual aids can serve as an effective communication medium in human-robot interaction. These findings underscore the potential of generative visual aids to inform the design of more intuitive and effective human-robot communication, particularly for complex communication scenarios in human-robot interaction and LLM-based end-user development.<\/p>\n\n\n\n<p><a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3706598.3714238\">https:\/\/dl.acm.org\/doi\/10.1145\/3706598.3714238<\/a><\/p>\n\n\n\n<p><strong>Walk in Their Shoes to Navigate Your Own Path: Learning About Procrastination Through A Serious Game<\/strong><\/p>\n\n\n\n<p><strong>Runhua Zhang<\/strong>, Jiaqi Gan,&nbsp;<strong>Shangyuan Gao<\/strong>,&nbsp;<strong>Siyi Chen<\/strong>,&nbsp;<strong>Xinyu Wu<\/strong>,&nbsp;<strong>Dong Chen<\/strong>,<strong>&nbsp;Yulin Tian<\/strong>,<strong>&nbsp;Qi Wang\u2020<\/strong>, and Pengcheng An\u2020<\/p>\n\n\n\n<p><strong>Keywords &nbsp;<\/strong>&nbsp;Procrastination, Serious Games, Learning, Reflection<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"317\" src=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-3-1024x317.png\" alt=\"\" class=\"wp-image-1261\" srcset=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-3-1024x317.png 1024w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-3-300x93.png 300w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-3-768x238.png 768w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-3.png 1080w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p><strong>Abstract&nbsp;<br><\/strong>Procrastination, the voluntary delay of tasks despite potential negative consequences, has prompted numerous time and task management interventions in the HCI community. 
While these interventions have shown promise in addressing specific behaviors, psychological theories suggest that learning about procrastination itself may help individuals develop their own coping strategies and build mental resilience. However, little research has explored how to support this learning process through HCI approaches. We present ProcrastiMate, a text adventure game where players learn about procrastination\u2019s causes and experiment with coping strategies by guiding in-game characters in managing relatable scenarios. Our field study with 27 participants revealed that ProcrastiMate facilitated learning and self-reflection while maintaining psychological distance, motivating players to integrate newly acquired knowledge in daily life. This paper contributes empirical insights on leveraging serious games to facilitate learning about procrastination and offers design implications for addressing psychological challenges through HCI approaches.<\/p>\n\n\n\n<p><a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3706598.3715271\">https:\/\/dl.acm.org\/doi\/10.1145\/3706598.3715271<\/a><\/p>\n\n\n\n<p><strong>02&nbsp;<\/strong><strong>LATE-BREAKING WORK<\/strong><\/p>\n\n\n\n<p><strong>Align with Me, Not TO Me: How People Perceive Concept Alignment with LLM-Powered Conversational Agents<\/strong><\/p>\n\n\n\n<p><strong>Shengchen Zhang<\/strong>,&nbsp;<strong>Weiwei Guo\u2020<\/strong>, and Xiaohua Sun<\/p>\n\n\n\n<p><strong>Keywords &nbsp;<\/strong>&nbsp;Concept Alignment, Grounding, Conversational Agents, Large Language Models, Human-Agent Interaction<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"683\" height=\"603\" src=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-4.png\" alt=\"\" class=\"wp-image-1262\" srcset=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-4.png 683w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-4-300x265.png 300w\" 
sizes=\"(max-width: 683px) 100vw, 683px\" \/><\/figure>\n\n\n\n<p><strong>Abstract&nbsp;<br><\/strong>Concept alignment\u2014building a shared understanding of concepts\u2014is essential for human-human and human-agent communication. While large language models (LLMs) promise human-like dialogue capabilities for conversational agents, the lack of studies on people\u2019s perceptions and expectations of concept alignment hinders the design of effective LLM agents. This paper presents results from two lab studies with human-human and human-agent pairs using a concept alignment task. Quantitative and qualitative analysis reveals and contextualizes potentially (un)helpful dialogue behaviors, how people perceived and adapted to the agent, as well as their preconceptions and expectations. Through this work, we demonstrate the co-adaptive and collaborative nature of concept alignment and identify potential design factors and their trade-offs, sketching the design space of concept alignment dialogues. 
We conclude by calling for designerly endeavors on understanding concept alignment with LLMs in context, as well as technical efforts to combine theory-informed and LLM-driven approaches.<\/p>\n\n\n\n<p><a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3706599.3720126\">https:\/\/dl.acm.org\/doi\/10.1145\/3706599.3720126<\/a><\/p>\n\n\n\n<p><strong>03&nbsp;<\/strong><strong>STUDENT DESIGN COMPETITION<\/strong><\/p>\n\n\n\n<p><strong>HabitAt: Bridging Humans and Wildlife toward a Sustainable Future<\/strong><\/p>\n\n\n\n<p><strong>Yu-chieh Cheng*<\/strong>,&nbsp;<strong>Zixuan Zhang*<\/strong>, and&nbsp;<strong>Huiting Huang*<\/strong><\/p>\n\n\n\n<p><strong>Keywords &nbsp;<\/strong>&nbsp;Sustainable Cities and Communities, Human-Wildlife Coexistence, Participatory Workshop, Empathy-Driven Interaction Design<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"886\" height=\"570\" src=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-5.png\" alt=\"\" class=\"wp-image-1263\" srcset=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-5.png 886w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-5-300x193.png 300w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-5-768x494.png 768w\" sizes=\"(max-width: 886px) 100vw, 886px\" \/><\/figure>\n\n\n\n<p><strong>Abstract&nbsp;<br><\/strong>Urban pollution poses significant challenges to both human and wildlife health, necessitating innovative approaches to promote sustainable coexistence. This study explores the potential of human-animal interaction as a lens to foster environmental awareness and empathy. We propose HabitAt, a conceptual design prototype that leverages animals\u2019 superior sensory capabilities to detect urban pollution, enabling a deeper understanding of environmental conditions while encouraging sustainable behavior. 
To inform the design, we conducted a participatory workshop where participants engaged in role-playing activities to experience urban environments from multiple perspectives. Observations from the workshop were synthesized into HabitAt\u2019s design, which integrates ecological data visualization and interactive elements to foster deeper human-animal connections and encourage everyday environmental-protection actions. Moving forward, we will refine app features, resolve technical challenges, and expand its applications to better support sustainable urban development.<\/p>\n\n\n\n<p><a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3706599.3720313\">https:\/\/dl.acm.org\/doi\/10.1145\/3706599.3720313<\/a><\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"508\" src=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-6-1024x508.png\" alt=\"\" class=\"wp-image-1264\" srcset=\"https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-6-1024x508.png 1024w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-6-300x149.png 300w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-6-768x381.png 768w, https:\/\/tongjicdi.com\/wp-content\/uploads\/2025\/05\/image-6.png 1080w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>The annual academic highlight of the human-computer interaction field, the ACM CHI Conference on Human Factors in C 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[22],"tags":[],"class_list":["post-1257","post","type-post","status-publish","format-standard","hentry","category-22"],"_links":{"self":[{"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/posts\/1257","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/comments?post=1257"}],"version-history":[{"count":1,"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/posts\/1257\/revisions"}],"predecessor-version":[{"id":1265,"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/posts\/1257\/revisions\/1265"}],"wp:attachment":[{"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/media?parent=1257"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/categories?post=1257"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/tongjicdi.com\/index.php\/wp-json\/wp\/v2\/tags?post=1257"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}