Yixin Zhu

Assistant Professor

PKU Institute for AI


Dr. Yixin Zhu received his Ph.D. ('18) from UCLA, advised by Prof. Song-Chun Zhu. His research builds interactive AI by integrating high-level commonsense knowledge (functionality, affordance, physics, causality, intent) with raw sensory inputs (pixels and haptic signals) to enable richer representations and cognitive reasoning about objects, scenes, shapes, numbers, and agents. Dr. Zhu directs the PKU CoRe Lab, which works on abstract reasoning, visually grounded reasoning, and interactive reasoning.

[CV] [Dissertation]

Getting Started — a self-study guide covering all four years of undergraduate AI coursework
Tong Class articles — a collection of coverage of the Tong Class from official and authoritative sources
Boya No. 1 (Chengdu) — currently available only to full-time PIs of the Institute for AI and PKU Tong Class students


[arXiv22] Neural-Symbolic Recursive Machine for Systematic Generalization
[arXiv22] EST: Evaluating Scientific Thinking in Artificial Agents
[arXiv22] Evaluating and Inducing Personality in Pre-trained Language Models
[arXiv22] PartAfford: Part-level Affordance Discovery from 3D Objects
[arXiv21] HALMA: Humanlike Abstraction Learning Meets Affordance in Rapid Problem Solving


[ICCV23] Full-Body Articulated Human-Object Interaction
[ICCV23] X-VoE: Measuring eXplanatory Violation of Expectation in Physical Events
[IROS23] Learning a Causal Transition Model for Object Cutting
[IROS23] Part-level Scene Reconstruction Affords Robot Interaction
[IROS23] Sequential Manipulation Planning for Over-actuated Unmanned Aerial Manipulators
[ACL23Demo] PersLEARN: Research Training through the Lens of Perspective Cultivation
[ICML23] MEWL: Few-shot multimodal word learning with referential uncertainty
[ICML23] On the Complexity of Bayesian Generalization
[CVPR23] Diffusion-based Generation, Optimization, and Planning in 3D Scenes
[ICRA23] GenDexGrasp: Generalizable Dexterous Grasping
[ICRA23] Rearrange Indoor Scenes for Human-Robot Co-Activity
[Engineering23] A Reconfigurable Data Glove for Reconstructing Physical and Virtual Grasps