HAI 20 Meta Learning

Thursday, October 15, 2020

HAI 20 notes on meta learning and machine learning.

High level takeaways

  • Training and testing on a single fixed distribution should be replaced with a distribution over distributions (meta learning)
  • Transformer architectures can learn relational representations (usually applied to natural language, but extensible to object detection)
  • Object-based representations of relations are useful for learning physics and continuity
  • Complicated images take longer for the brain to process and solve
  • Intuitive physics and graphs
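The first takeaway — training over a distribution of task distributions rather than one fixed dataset — can be sketched as an episodic sampling loop. This is an illustrative sketch only; the linear-regression task family and names are hypothetical, not from the talk:

```python
import random

def sample_task(rng):
    """Hypothetical task family: learn y = a*x + b for randomly drawn (a, b).

    Each task is itself a distribution over (x, y) pairs, so the meta-level
    draw below is literally a draw from a distribution over distributions.
    """
    a, b = rng.uniform(-2, 2), rng.uniform(-1, 1)
    def draw_batch(n):
        # Task-level draw: sample n labeled points from this task.
        xs = [rng.uniform(-1, 1) for _ in range(n)]
        return [(x, a * x + b) for x in xs]
    return draw_batch

rng = random.Random(0)
meta_batch = [sample_task(rng) for _ in range(4)]  # meta-level draw: 4 tasks
support_sets = [task(5) for task in meta_batch]    # 5 labeled points per task
```

A meta-learner would then adapt on each support set and be evaluated on fresh draws from the same task, rather than on a held-out split of one fixed dataset.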

Modeling something requires specifying the following components:

  1. Architecture
  2. Task
  3. Dataset
  4. Learning rule
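The four ingredients above can be captured in a small configuration object (a minimal sketch; the class and field values are illustrative, not from the talk):

```python
from dataclasses import dataclass

@dataclass
class ModelSpec:
    """The four ingredients that pin down a modeling experiment."""
    architecture: str   # e.g. "ConvRNN", "Transformer"
    task: str           # e.g. "object detection"
    dataset: str        # e.g. "ImageNet"
    learning_rule: str  # e.g. "SGD", "contrastive"

spec = ModelSpec("ConvRNN", "object detection", "ImageNet", "SGD")
```

Framing an experiment this way makes it explicit that comparing models is only meaningful when the other three components are held fixed.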

Things to look into

  • ConvRNN (tradeoff between the number of neurons and performance)
  • Deep contrastive embeddings
  • Beta variational autoencoder (β-VAE)
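The β-VAE differs from a standard VAE only in weighting the KL term by a factor β > 1. A sketch of the objective for a diagonal-Gaussian posterior, in NumPy (illustrative only; `recon_error` stands in for whatever reconstruction loss the decoder uses):

```python
import numpy as np

def beta_vae_loss(recon_error, mu, logvar, beta=4.0):
    """beta-VAE objective: reconstruction error plus beta-weighted KL.

    KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior with mean `mu`
    and log-variance `logvar` has the closed form below. Setting beta > 1
    pressures the latent code toward more disentangled factors.
    """
    kl = -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar))
    return recon_error + beta * kl

# With mu = 0 and logvar = 0 the KL term vanishes, leaving only
# the reconstruction error.
loss = beta_vae_loss(1.0, np.zeros(8), np.zeros(8))
```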
Tags: research, neuroscience, psychology, machine learning

HAI 20 HRI Learning

Depression, Anxiety, Loneliness, BU