One-Shot Imitation Learning | Yan Duan et al. | 2017

Summary


Abstract

  • Ideally, robots should be able to learn from very few demonstrations of any given task, and instantly generalize to new situations of the same task, without requiring task-specific engineering.
  • In this paper, we propose a meta-learning framework for achieving such capability, which we call one-shot imitation learning.
  • Task examples:
    • to stack all blocks on a table into a single tower
    • to place all blocks on a table into two-block towers
  • A neural net is trained so that,
    • when it takes as input
      • the first demonstration and
      • a state sampled from a second demonstration of the same task,
    • it predicts the action corresponding to the sampled state (a minimal sketch of this training step follows this list).
  • Our experiments show that the use of soft attention allows the model to generalize to conditions and tasks unseen in the training data (a generic sketch of soft attention also appears below).
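
The training step described above can be sketched as demonstration-conditioned behavioral cloning. The code below is an illustrative toy (not the authors' architecture): PolicyNet, demo_a, state_b, action_b, and all dimensions are assumptions, and the demonstration is assumed to already be summarized into a fixed-size embedding.

import torch
import torch.nn as nn

class PolicyNet(nn.Module):
    """Toy policy: conditions on a demonstration embedding and a state."""
    def __init__(self, state_dim, demo_dim, action_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + demo_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, action_dim),
        )

    def forward(self, demo, state):
        # demo: (batch, demo_dim) summary of the first demonstration
        # state: (batch, state_dim) state sampled from the second demonstration
        return self.net(torch.cat([demo, state], dim=-1))

state_dim, demo_dim, action_dim = 10, 64, 4
policy = PolicyNet(state_dim, demo_dim, action_dim)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

# One training step on synthetic stand-in data for a sampled task:
demo_a = torch.randn(32, demo_dim)      # embedding of demonstration 1
state_b = torch.randn(32, state_dim)    # states sampled from demonstration 2
action_b = torch.randn(32, action_dim)  # corresponding expert actions

loss = nn.functional.mse_loss(policy(demo_a, state_b), action_b)
opt.zero_grad()
loss.backward()
opt.step()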
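
The soft attention mentioned in the last bullet can be illustrated with standard scaled dot-product attention, where a state embedding attends over the embeddings of the demonstration's steps. This is a generic sketch under that assumption, not the paper's exact attention module; all shapes and names are illustrative.

import torch

def soft_attention(query, keys, values):
    # query: (batch, d); keys, values: (batch, T, d)
    scores = torch.einsum("bd,btd->bt", query, keys) / keys.shape[-1] ** 0.5
    weights = torch.softmax(scores, dim=-1)  # (batch, T), sums to 1 per row
    return torch.einsum("bt,btd->bd", weights, values)

q = torch.randn(32, 64)        # current-state embedding
kv = torch.randn(32, 50, 64)   # embeddings of 50 demonstration steps
context = soft_attention(q, kv, kv)  # (32, 64) demonstration summary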
