[IROS20] Graph-based Hierarchical Knowledge Representation for Robot Task Transfer from Virtual to Physical World

Overview of the proposed framework for learning abstract knowledge for robot task transfer. (a) Using the Oculus headset and Touch controller, (b) a subject demonstrates the cloth-folding task in a physically realistic VR environment. Our algorithm (c) induces a hierarchical graph-based knowledge representation from the human demonstrations and (d) transfers it to a physical Baxter robot for execution by minimizing relative entropy.

Abstract

We study the hierarchical knowledge transfer problem using a cloth-folding task, wherein the agent first learns from a set of human demonstrations performed in the virtual world with an Oculus headset, and the learned knowledge is later transferred to and validated on a physical Baxter robot. We argue that such an intricate robot task transfer across different embodiments is only realizable if an abstract and hierarchical knowledge representation is formed to facilitate the process, in contrast to the prior sim-to-real literature in reinforcement learning settings. Specifically, the knowledge in both the virtual and physical worlds is measured by information entropy built on top of a graph-based representation, so that the problem of task transfer becomes the minimization of the relative entropy between the two worlds. An And-Or Graph (AOG) is introduced to represent the knowledge, induced from human demonstrations performed across six virtual scenarios in virtual reality (VR). In the transfer experiments, the success of the physical Baxter robot platform across all six scenarios demonstrates the efficacy of the graph-based hierarchical knowledge representation.
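To make the entropy-based transfer objective more concrete, below is a minimal sketch, not the paper's implementation: it assumes the knowledge in each world is summarized as a discrete distribution over parse graphs sampled from the induced AOG, and poses transfer as picking the physical-world grounding whose distribution has minimal relative entropy (KL divergence) to the virtual-world distribution. The distribution contents, the `select_grounding` helper, and the parse-graph labels are illustrative assumptions.

```python
# Illustrative sketch (assumed setup, not the paper's code): knowledge in each
# world is a discrete distribution over parse graphs of the induced And-Or Graph,
# and transfer selects the physical-world grounding minimizing KL divergence
# to the virtual-world distribution.
import math

def relative_entropy(p, q, eps=1e-12):
    """KL(p || q) for distributions given as {parse_graph_id: probability}."""
    support = set(p) | set(q)
    return sum(p.get(g, 0.0) * math.log((p.get(g, 0.0) + eps) / (q.get(g, 0.0) + eps))
               for g in support if p.get(g, 0.0) > 0.0)

def select_grounding(p_virtual, candidate_groundings):
    """Pick the candidate physical-world distribution closest to the virtual one."""
    return min(candidate_groundings, key=lambda q: relative_entropy(p_virtual, q))

# Toy usage: three hypothetical parse graphs of a cloth-folding AOG,
# two candidate robot-side groundings.
p_vr = {"fold_left_then_right": 0.6, "fold_right_then_left": 0.3, "fold_in_half": 0.1}
candidates = [
    {"fold_left_then_right": 0.5, "fold_right_then_left": 0.4, "fold_in_half": 0.1},
    {"fold_left_then_right": 0.1, "fold_right_then_left": 0.1, "fold_in_half": 0.8},
]
best = select_grounding(p_vr, candidates)
print("selected grounding:", best, "KL:", relative_entropy(p_vr, best))
```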

Publication
In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020
