{1482}
ref: -2019 tags: meta learning feature reuse deepmind date: 10-06-2019 04:14 gmt revision:1

Rapid learning or feature reuse? Towards understanding the effectiveness of MAML

  • It's feature re-use!
  • Show this by freezing the weights of a 5-layer convolutional network during task-specific (inner-loop) adaptation on MiniImageNet, either 5-way 1-shot or 5-way 5-shot; performance is essentially unchanged.
  • From this derive ANIL (Almost No Inner Loop), where only the last network layer is updated during task-specific (inner-loop) adaptation; see the sketch after this list.
  • Show that ANIL also works on basic RL tasks.
  • Roughly, this means the network does not benefit much from joint encoding -- encoding both the task at hand and the feature set. Features can be learned independently of the task (at least for these tasks) with little loss.
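
A minimal PyTorch sketch (not the paper's code) of the inner-loop difference between MAML and ANIL on a toy 5-way task: with anil=True only the linear head is adapted and the convolutional body is reused as a fixed feature extractor. The names (FewShotNet, inner_adapt), hyper-parameters, and backbone details are assumptions, and the outer meta-training loop is omitted.

```python
# Illustrative sketch of the MAML vs. ANIL inner loop (hypothetical names/shapes).
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_body(in_ch=3, hidden=64):
    """Standard few-shot backbone: 4 conv blocks + flatten (illustrative)."""
    def block(ci, co):
        return nn.Sequential(nn.Conv2d(ci, co, 3, padding=1),
                             nn.BatchNorm2d(co), nn.ReLU(), nn.MaxPool2d(2))
    return nn.Sequential(block(in_ch, hidden), block(hidden, hidden),
                         block(hidden, hidden), block(hidden, hidden),
                         nn.Flatten())


class FewShotNet(nn.Module):
    def __init__(self, n_way=5, feat_dim=64 * 5 * 5):  # 84x84 input -> 5x5 feature maps
        super().__init__()
        self.body = conv_body()           # feature extractor
        self.head = nn.Linear(feat_dim, n_way)  # task-specific classifier

    def forward(self, x):
        return self.head(self.body(x))


def inner_adapt(net, support_x, support_y, lr=0.01, steps=5, anil=True):
    """One task's inner-loop adaptation.

    anil=True : only the head is updated (feature reuse); the body stays frozen.
    anil=False: all parameters are updated, as in plain MAML.
    (First-order only; meta-gradients and the outer loop are not shown.)
    """
    params = list(net.head.parameters()) if anil else list(net.parameters())
    opt = torch.optim.SGD(params, lr=lr)
    for _ in range(steps):
        loss = F.cross_entropy(net(support_x), support_y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return net


if __name__ == "__main__":
    # Toy 5-way 1-shot task: 5 random "support" images, 84x84 RGB.
    net = FewShotNet(n_way=5)
    x = torch.randn(5, 3, 84, 84)
    y = torch.arange(5)
    inner_adapt(net, x, y, anil=True)  # ANIL: convolutional body is left untouched
```

The only difference between the two variants is which parameters the inner-loop optimizer sees, which is why ANIL is so cheap relative to MAML while matching its accuracy on these benchmarks.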