2023 Fall

Project 1: Intuitive Physics

Humans exhibit a remarkable capacity for intuitive physics even without formal physics training: grounding physical concepts, predicting physical dynamics, and making physical judgments. Despite the crucial role intuitive physics plays in our lives, its underlying computational mechanism remains unclear. There has been a long-standing debate over whether humans understand, predict, and act in physical scenarios using simulation (unfolding future states) or heuristics (quick shortcuts). Do humans use mental simulation all the time? If not, what heuristic alternatives do people use, and how do they lead to different behaviors? When do these two systems switch?

To explore the mechanisms of these two cognitive systems, we have released two experiments focusing on everyday physical events. If you are interested, please click this link to get involved. We will contact you after you submit the form. Should you have any questions regarding the experiments, please reach out to Yuki Ma (Email: [email protected]).

Qualified completion will receive 2 course credits.

Project 2: Nonverbal Communication

The subtlest motions and gestures can carry profound meaning: they express ideas, emotions, and messages. Picture a world where every subtle human gesture is not just seen but understood by machines, where AI can discern and produce the intricate nuances of our movements. Such a capability could enhance the precision of robotics, create stunning animations, and refine dynamic visuals. To embark on this journey, we are assembling a comprehensive video dataset dissecting the intricacies of human motion. Your participation is an invitation to help unveil the finest details of our actions. Click this link to be a part of reshaping the future of motion and expression.

Should you have any questions regarding the experiments, please reach out to Shuwen Qiu (Email: [email protected]).

Qualified completion will receive 2 course credits.

Project 3: Does ChatGPT Possess Social Intelligence?

Large language models (LLMs), such as ChatGPT, have demonstrated impressive performance across a variety of text-based tasks. Some researchers have even suggested that GPT-4 shows sparks of general intelligence, acquiring human-like cognitive abilities such as theory of mind. However, because LLMs are trained on vast amounts of data, some of which remain private and inaccessible to the public, existing test questions may already appear in their training data. It is therefore crucial to develop novel tasks to accurately assess the true capabilities of LLMs.

As human beings, we are capable of reading minds, that is, inferring hidden mental states (e.g., beliefs, goals, intentions, desires) from others’ actions. We can also “tell” our intentions to others, not only through verbal communication but also through actions. These abilities allow us to function smoothly and efficiently in social interactions. Can LLMs “read” minds? Can LLMs “tell” others about their mental states?

We invite you to participate in two novel experiments related to social intelligence. You will engage in role-playing exercises in real-world scenarios, striving to interpret others’ intentions and convey your own intentions to others.

Please click the link to enroll. Once you have submitted the form, we will contact you about the in-person experiments. Should you have any questions regarding the experiments, please reach out to Yuki Ma (Email: [email protected]).

Qualified completion will receive 2 course credits.