Invited Speakers

  1. Youichiro Miyake, Human Agent Interaction in Digital Games
  2. Henny Admoni, Improving Bi-Directional Understanding in Human-AI Interaction
  3. Judith Masthoff, Personalizing Human-AI Interaction
  4. sasakure.UK & UKRampage (有形ランペイジ)

Human Agent Interaction in Digital Games

Youichiro Miyake

SQUARE ENIX Lead AI Researcher
The University of Tokyo

Abstract

The purpose of digital games is to provide users with new experiences,
and various game technologies serve to diversify those experiences.
Over nearly 50 years of digital game history, the game experience has evolved
alongside technological advancements and developments in game design.
In particular, interactions with agents (non-player characters) in digital games
are among the central elements of the game experience, leaving the strongest impact on users’ memories.
This lecture will explain the history and evolution of user-agent interaction
in digital games.

Biography

Youichiro Miyake is a project professor at the Institute of Industrial Science (IIS), the University of Tokyo. For twenty years he has been developing AI for digital games in the game industry as an AI technical director, while researching game AI technologies as a lead AI researcher at a digital game company. He has developed and designed AI for many digital games. He is also a board member of the Digital Games Research Association (DiGRA) JAPAN and has chaired SIG-AI of the International Game Developers Association Japan Chapter (IGDA Japan) since 2006. He researches the technical design of combining smart cities and the metaverse using the game AI system “Meta-Character-Spatial AI dynamic cooperative model”. He has published more than 10 books on game AI as well as many academic papers, and he has given many lectures at international academic conferences such as SIGGRAPH, SIGGRAPH ASIA, the IEEE Conference on Games, and the ACM International Conference on Intelligent Virtual Agents, and at international industry conferences such as GDC and CEDEC.


Improving Bi-Directional Understanding in Human-AI Interaction

Henny Admoni

Associate Professor, Robotics Institute
Carnegie Mellon University

Abstract

Intelligent agents hold much promise for improving people’s lives, from driving assistance systems to mental health coaches. Such HAI tasks inherently involve bidirectional interactions, in which humans and agents both try to interpret and respond to their partner’s actions. As AI systems become more complex, their decision making becomes simultaneously more useful and harder to understand. In this talk, I will present several examples from my lab’s research on how people interpret the behavior of AI agents, and how agents can interpret the behavior of people. I’ll describe how we can use cognitive science concepts like theory of mind to improve the communication and explainability of AI systems, making them more useful for assistive and collaborative tasks with people.

Biography

Dr. Henny Admoni is an Associate Professor in the Robotics Institute at Carnegie Mellon University, where she leads the Human And Robot Partners (HARP) Lab. Dr. Admoni’s research interests include human-robot interaction, assistive robotics, and nonverbal communication. Dr. Admoni holds a PhD in Computer Science from Yale University, and a BA/MA joint degree in Computer Science from Wesleyan University.


Personalizing Human-AI Interaction

Judith Masthoff

Utrecht University

Abstract

AI can support humans to make decisions, change behaviour, and improve wellbeing. To optimally do so, AI needs to adapt to humans, personalizing its support, taking into account characteristics of individuals, groups, and context. This talk discusses what kinds of personalization are needed and how personalization can be researched. It provides examples from recommender systems, digital behaviour interventions, and emotional support systems. The ethical aspects of personalization are also touched upon.

Biography

Judith Masthoff is a Professor in Human Centered Computing at Utrecht University. Her research focusses on personalization: how artificial intelligence systems can automatically adapt to humans. She has applied this in many areas including recommender systems, digital behavior interventions, emotional support systems, intelligent tutoring systems, personalized travel advice and e-health. She is Editor in Chief of the User Modeling and User-Adapted Interaction journal and a director of User Modeling Inc., the professional association of user modeling researchers.


sasakure.UK & UKRampage (有形ランペイジ)

Bringing to life music no one has ever heard. To those who have lived, it feels like “something new”; to those who know nothing yet, it becomes “the natural order.” Thus, a “New World Order” of music is formed.
Using the synthetic voice software “VOCALOID,” represented by the anthropomorphized character Hatsune Miku, sasakure.UK has woven songs themed around the relationship between humans and machines. To further expand this musical expression, he gathered virtuoso players active at the cutting edge to form a band. sasakure.UK himself participates as a member, advocating for “tangible” music played with instruments and bodies, transcending the traditional realm of Desktop Music (DTM). Their exceptional performance skills breathe life into sasakure.UK’s complex and intricate compositions, once deemed unplayable. On October 17, 2011, they released their first album, ‘UKSekaiReconstruction (Reconstruction of the Tangible World/有形世界リコンストラクション)’, via Pony Canyon. Its overwhelming content, reconstructing complex VOCALOID tracks, astonished many listeners. In 2017, they restarted with new members, releasing the albums ‘Aru Katachi (有ル形)’ and ‘ODYSSEY’. In 2024, they announced the album ‘Jiyuritsu (自有律)’, continuing their unique musical expression while pioneering new frontiers.

Important Dates

Registration Deadline:

  • Early-bird registration deadline: September 21, 2025 (23:59 AoE) (closed)
  • Final registration deadline: November 3, 2025 (23:59 AoE)
    * Banquet guarantee deadline: October 20, 2025 (23:59 AoE) (closed)

Conference Dates: November 10-13, 2025

Registration

Please visit the Registration Page for details on fees and how to register.

Call for Papers

The call for papers is now closed. Thank you to all who submitted their work!