User Interface
User interface (UI) research focuses on designing and improving how humans interact with digital systems, aiming to enhance usability, accessibility, and overall user experience. Current work emphasizes large language models (LLMs) and multimodal models, often combined with computer vision techniques, to understand and generate UI elements, automate tasks, and even predict usability. This line of research matters because it promises to improve accessibility for diverse user groups, increase efficiency in domains such as healthcare and robotics, and yield insights into human-computer interaction for researchers and designers alike.
Papers
DreamStruct: Understanding Slides and User Interfaces via Synthetic Data Generation
Yi-Hao Peng, Faria Huq, Yue Jiang, Jason Wu, Amanda Xin Yue Li, Jeffrey Bigham, Amy Pavel
Interactive Speculative Planning: Enhance Agent Efficiency through Co-design of System and User Interface
Wenyue Hua, Mengting Wan, Shashank Vadrevu, Ryan Nadel, Yongfeng Zhang, Chi Wang
Exploring User Acceptance Of Portable Intelligent Personal Assistants: A Hybrid Approach Using PLS-SEM And fsQCA
Gustave Florentin Nkoulou Mvondo, Ben Niu
Non-verbal Interaction and Interface with a Quadruped Robot using Body and Hand Gestures: Design and User Experience Evaluation
Soohyun Shin, Trevor Evetts, Hunter Saylor, Hyunji Kim, Soojin Woo, Wonhwha Rhee, Seong-Woo Kim