I consider myself a connectionist, fascinated by understanding natural intelligence by creating something similar through computational modeling with neural networks. My research interests center on Embodied AI, Multimodality, and Natural Language Processing, with the goal of building scalable cognitive agents that, like humans, perceive, act, and learn continuously and sample-efficiently from experience in both physical and virtual worlds.
If you're a JHU student interested in topics like memory, continual learning, skill acquisition, spatial intelligence, world models, representation learning, developmental psychology, behavioral emergence, or consciousness, and you'd like to work with me, feel free to drop me an email to set up a chat.
[2025.06] We released Virtual Community — a simulator where humans and robots cohabit in dynamic, open societies with unlimited characters, robots, and cities!
[2025.01] COMFORT was accepted to ICLR 2025 as an oral presentation.
Virtual Community: An Open World for Humans, Robots, and Society
Qinhong Zhou*, Hongxin Zhang*, Xiangye Lin*, Zheyuan Zhang*, Yutian Chen, Wenjun Liu, Zunzhe Zhang, Sunli Chen, Lixing Fang, Qiushi Lyu, Xinyu Sun, Jincheng Yang, Zeyuan Wang, Bao Chi Dang, Zhehuan Chen, Daksha Ladia, Jiageng Liu, Chuang Gan
International Conference on Learning Representations (ICLR), 2026
Project Page | Paper | Code
Explainable Procedural Mistake Detection
Shane Storks, Itamar Bar-Yossef, Yayuan Li, Zheyuan Zhang, Jason J. Corso, Joyce Chai
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2025
Paper | Code