Special Education News
On February 20, 2026, CHI announced its 2026 paper acceptance results. The paper "ArtfulSign: A Closed-Loop, Semantics-Grounded Mobile System for Learning Chinese Sign Language," completed by faculty and students of our college (first author: Yuan Zhao, master's student in the School of Robotics (School of Artificial Intelligence); corresponding authors: Professor Dengfeng Yao and Wei Zhen of the School of Special Education), was accepted to the Interactive Demos track of CHI 2026 (ACM Conference on Human Factors in Computing Systems), a premier international conference in human-computer interaction.
CHI is organized by the Association for Computing Machinery (ACM). It is one of the flagship conferences in human-computer interaction and is rated a Class A international conference by the China Computer Federation (CCF). The CHI 2026 Interactive Demos track received 323 submissions, with a final acceptance rate of 21%, reflecting intense competition.
The work presents a mobile system for learning Chinese Sign Language, built around two interaction-design ideas: closed-loop practice and semantics-grounded learning. Through a real-time feedback mechanism, it organizes learning into a repeatable "learn, practice, correct, practice again" interaction loop, supporting self-directed learning of an embodied skill. The system has been deployed in real-world settings and released on app stores, giving it a solid practical foundation. Following its acceptance to the Interactive Demos track, the work will be demonstrated live at CHI 2026.
In recent years, the team has carried out sustained research on artificial intelligence and information accessibility, actively exploring applications of AI in special education and in building accessible environments. This acceptance at CHI marks new progress for the team at the intersection of human-centered AI and human-computer interaction.
Paper information:
ArtfulSign: A Closed-Loop, Semantics-Grounded Mobile System for Learning Chinese Sign Language
Yuan Zhao, Yueran Wang, Chenglong Tan, Siyang Tong, Dengfeng Yao†, Wei Zhen†
Abstract: Learning Chinese Sign Language (CSL) is fundamentally an embodied skill learning process that requires repeated physical practice and timely feedback, yet most existing tools remain largely open-loop and focus on passive content consumption. We explore how a closed-loop, semantics-grounded interaction design can better support independent CSL learning, and present ArtfulSign, a mobile system that integrates real-time, on-device recognition feedback with meaning-oriented explanations and lightweight situated scenarios. ArtfulSign organizes learning into short practice loops in which learners perform a sign, receive immediate feedback, and iteratively refine their performance. We will demonstrate ArtfulSign as an interactive CHI demo, where attendees can experience closed-loop embodied practice and learn a CSL sign within minutes.