I am currently a Human-Computer Interaction (HCI) researcher. In the past, I was a full-stack developer, and I aspire to become an entrepreneur in the future. My present research interests are malleable programming interfaces and interaction.
I will pursue my PhD at MIT CSAIL under the supervision of Dr. Arvind Satyanarayan, continuing my research on programming interfaces. I completed my master's in Computer Science at the University of Waterloo, mentored by Dr. Jian Zhao of the WatVis Lab at UWaterloo. Before that, I was guided by Dr. Zhicong Lu of the DEER Lab and Dr. Can Liu of the ERFI Lab at CityU HK.
I'm actively looking for a research internship position in the summer of 2025, working on malleable programming interfaces.
I was honored to give a talk at Tableau Research on malleable interaction interfaces for programming, sharing my research on the design of a system that enables programmers to edit code dynamically.
CoLadder, Memolet, and a paper co-authored with ChingYi on eliciting walking gestures for AR.
We explored the workflow of collaborative natural language programming and designed a system to support prompt sharing and referring.
UIST24 Poster
We present an initial step towards building a system for programmers to edit code using free-form sketch annotations drawn directly onto the editor and output windows. Using a working prototype as a technical probe, we conducted an exploratory study examining how programmers sketch annotations on Python code to communicate edits for an AI model to perform.
UIST24 Paper
A system that assists programmers through hierarchical task decomposition, incremental code generation, and verification of results during prompt authoring, bridging the abstraction gap between programmers and LLMs.
UIST24 Paper
To support users in recalling and reusing relevant user-AI conversational memories, we introduce Memolet, an interactive object that reifies memory reuse. Users can directly manipulate Memolet to specify which memories to reuse and how to use them. We developed a system demonstrating Memolet's interaction across various memory reuse stages.
UIST24 Paper
We conduct a systematic examination of different kinds of intentional variations from a normal gait that could be used as input actions without interrupting overall walking progress. A design space of 22 candidate Gait Gestures is generated by adapting previous standing foot input actions and identifying new actions possible in a walking context.
IEEE VIS24 Poster
We propose a system that supports contextually aware, controllable, and interactive exploration of academic publications and scholars, enabling bidirectional interaction between question-answering components and Scholets, 2D projections of scholarly works' embeddings. We demonstrate the system through an exploratory study with graduate researchers.
CHI24 LBW
We have identified three major challenges and proposed three decision-making stages, each with its own relevant factors. Additionally, we present a thorough process model that captures programmers' interaction patterns.
CHI24 Paper
A system to support collaborative prompt engineering by providing referring, requesting, sharing, and linking mechanisms. It assists programmers in comprehending collaborators' prompts and building on their collaborators' work, reducing repetitive updates and communication costs.
CHI23 Paper
A narrative-based viewer participation tool that utilizes a dynamic graphical plot to reflect chatroom negativity. We discovered that StoryChat encouraged viewers to contribute prosocial comments, increased viewer engagement, and fostered viewers' sense of community.
CUI23 Paper
Exploring speech input in HCI, we address its editing challenges. Our study combines cognitive science with HCI, revealing memory patterns and proposing new interaction concepts for efficient speech editing.
CSCW23 Paper
A real-time, web-based Wizard-of-Oz (WoZ) platform that allows multiple Wizards to collaboratively operate a speech-to-text based system remotely. Our findings reveal the promises and challenges of the multi-Wizard approach and open up new research questions.
June 2021 - Aug 2021
May 2021 - Sep 2021
We developed a patented sensor for measuring dissolved oxygen in the ocean and used its fast, real-time readings to build an AI network dedicated to monitoring water quality in real time and predicting ocean health up to three months ahead.
Sep 2020 - May 2021
I was part of a new team at Networld, nearD, a social networking site focused on privacy, multi-identity, and locality.
Jan 2023 - June 2024
David R. Cheriton School of Computer Science
Sep 2018 - June 2022
Department of Computer Science
I love to read theoretical papers about interface design and HCI design principles.
Reification turns concepts into first class objects, polymorphism permits commands to be applied to objects of different types, and reuse makes both user input and system output accessible for later use.
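To make these three principles concrete, here is a minimal sketch in Python (the class names are my own hypothetical illustration, not code from the paper): a color choice is reified into a first-class object, the same command applies polymorphically to objects of different types, and past input remains accessible for reuse.

```python
# A minimal sketch of reification, polymorphism, and reuse
# (hypothetical classes for illustration; not from the paper).

class Swatch:
    """Reification: a color choice becomes a first-class object
    that can be stored, inspected, and passed around."""
    def __init__(self, rgb):
        self.rgb = rgb

class Shape:
    def __init__(self):
        self.fill = None
    def set_color(self, rgb):
        self.fill = rgb

class TextRun:
    def __init__(self):
        self.foreground = None
    def set_color(self, rgb):
        self.foreground = rgb

def apply_swatch(swatch, target):
    """Polymorphism: the same command applies to objects of
    different types, as long as they accept a color."""
    target.set_color(swatch.rgb)

history = []  # Reuse: past input stays accessible for later use.

red = Swatch((255, 0, 0))
shape, text = Shape(), TextRun()
apply_swatch(red, shape)
apply_swatch(red, text)
history.append(red)

# Later, the same reified swatch can be reused on a new object.
apply_swatch(history[-1], Shape())
```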
Demonstrational interfaces let the user perform actions on concrete example objects while constructing an abstract program, allowing the user to create parameterized procedures and objects without learning a programming language.
The seven-stage interaction model consists of (1) Establishing the Goal, (2) Forming the Intention, (3) Specifying the Action Sequence, (4) Executing the Action on the System's Interface, (5) Perceiving the System's State as a Response to the Action, (6) Interpreting the State, and (7) Evaluating the System State with respect to the Goals and Iterating until the goal is achieved.
The paper argues for a shift from interface design to interaction design as the means to significantly enhance user interfaces. It calls for the development of powerful interaction models, a better understanding of sensory-motor aspects, and novel interaction architectures addressing key challenges like reinterpretability, resilience, and scalability.
Instrumental Interaction describes graphical user interfaces in terms of domain objects and interaction instruments. Interaction between users and domain objects is mediated by interaction instruments, similar to the tools and instruments we use in the real world to interact with physical objects.
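As a rough illustration (my own hypothetical example, not code from the paper), an instrument can be modeled as an object that translates the user's physical actions into commands on a domain object:

```python
# A rough sketch of instrumental interaction
# (hypothetical classes for illustration).

class Document:
    """Domain object: what the user ultimately cares about."""
    def __init__(self):
        self.scroll_offset = 0

class Scrollbar:
    """Interaction instrument: mediates between the user's
    physical action (dragging the thumb) and the domain object."""
    def __init__(self, document, track_height, content_height):
        self.document = document
        self.track_height = track_height
        self.content_height = content_height

    def drag_thumb(self, delta_pixels):
        # Translate the physical action into a domain command.
        ratio = self.content_height / self.track_height
        self.document.scroll_offset += delta_pixels * ratio

doc = Document()
bar = Scrollbar(doc, track_height=100, content_height=1000)
bar.drag_thumb(10)          # the user drags the thumb 10 px...
print(doc.scroll_offset)    # ...the document scrolls 100 units
```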
Direct manipulation has been lauded as a good form of interface design, and some interfaces that have this property have been well received by users. This article delves into the cognitive aspects of direct manipulation interfaces, examining both their advantages and disadvantages.
Why do people create extra representations to help them make sense of situations, diagrams, illustrations, instructions and problems? The obvious explanation—external representations save internal memory and computation—is only part of the story.
Layout constraints in a user interface toolkit provide a declarative mechanism for controlling the size and position of objects in an interactive display, along with an efficient update mechanism for maintaining display layouts automatically in the face of dynamic changes.
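A toy version of the idea (my own sketch, assuming a hypothetical API rather than the toolkit's actual one) declares one object's position in terms of another's and re-evaluates the constraints when the layout changes:

```python
# A toy sketch of declarative layout constraints
# (hypothetical API for illustration).

class Box:
    def __init__(self, width):
        self.width = width
        self.x = 0

def solve(constraints):
    # Re-evaluate every constraint; a real toolkit maintains a
    # dependency graph so only invalidated constraints re-run.
    for constraint in constraints:
        constraint()

a = Box(width=120)
b = Box(width=80)

# Declarative constraint: b always sits 10 px to the right of a.
constraints = [lambda: setattr(b, "x", a.x + a.width + 10)]

solve(constraints)
print(b.x)        # 130

a.width = 200     # a dynamic change...
solve(constraints)
print(b.x)        # ...and the layout updates automatically: 210
```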
This paper explores the concept of "interaction," which lacks a clear definition in the field. It identifies various existing concepts, such as interaction as dialogue, transmission, optimal behavior, embodiment, and tool use. These concepts vary in scope and in their understanding of the causal relationships between humans and computers.