HAI 20 notes on meta-learning and machine learning.
High-level takeaways
- Training and testing on a single fixed distribution should be replaced with a distribution over distributions, i.e. meta-learning (see the episode-sampling sketch after this list)
- Transformer architectures can learn relational representations (usually applied to natural language, but they extend to object detection; see the attention sketch after this list)
- Object-based representations of relations are useful for learning physics and object continuity
- Complicated images take the brain longer to process and solve
- Intuitive physics can be captured with graph-based models
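
A minimal sketch of the "distribution over distributions" idea: every meta-training episode samples a fresh task (here an illustrative sinusoid-regression family, an assumption rather than anything from the lecture), so the learner sees a distribution of datasets instead of one fixed dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    """Sample one task from the task distribution: a random sinusoid.
    The amplitude and phase ranges are illustrative assumptions."""
    amplitude = rng.uniform(0.1, 5.0)
    phase = rng.uniform(0.0, np.pi)
    return lambda x: amplitude * np.sin(x + phase)

def sample_episode(k_shot=10):
    """Draw a support/query split from a freshly sampled task, so each
    episode is a new dataset, i.e. a draw from the meta-distribution."""
    f = sample_task()
    x = rng.uniform(-5.0, 5.0, size=2 * k_shot)
    y = f(x)
    return (x[:k_shot], y[:k_shot]), (x[k_shot:], y[k_shot:])

# Meta-training loop skeleton: adapt on the support set, evaluate on the query set.
for step in range(3):
    (x_s, y_s), (x_q, y_q) = sample_episode()
    # ... inner-loop adaptation on (x_s, y_s), outer-loop update from (x_q, y_q) ...
```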
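And a minimal sketch of why attention gives relational representations: each token (or detected object) scores its pairwise compatibility with every other token, so the score matrix is an explicit all-pairs relation. Shapes and names here are illustrative, not from the lecture.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product attention.
    X: (n, d) -- n tokens/objects with d features each.
    The (n, n) score matrix is an all-pairs relational representation."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])            # pairwise relations
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over pairs
    return weights @ V                                 # relation-weighted mixing

rng = np.random.default_rng(0)
n, d = 4, 8                        # e.g. 4 detected objects, 8 features each
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)   # (4, 8)
```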
Modeling a system requires specifying the following components:
- Architecture
- Task
- Dataset
- Learning rule
Things to look into
- ConvRNNs (convolutional recurrent networks; there is a tradeoff between the number of neurons and performance; see the cell sketch below)
- Deep contrastive embeddings (see the loss sketch below)
- β-VAE (beta variational autoencoder; see the loss sketch below)
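
A minimal ConvRNN cell sketch in PyTorch, assuming the basic convolutional-recurrent update (a generic version, not any specific paper's variant); the hidden-channel count is the knob behind the neurons-vs-performance tradeoff noted above.

```python
import torch
import torch.nn as nn

class ConvRNNCell(nn.Module):
    """Vanilla convolutional RNN cell: h_t = tanh(Conv(x_t) + Conv(h_{t-1})).
    hidden_ch controls the neuron count (the tradeoff noted above)."""
    def __init__(self, in_ch, hidden_ch, k=3):
        super().__init__()
        self.x2h = nn.Conv2d(in_ch, hidden_ch, k, padding=k // 2)
        self.h2h = nn.Conv2d(hidden_ch, hidden_ch, k, padding=k // 2)

    def forward(self, x, h):
        return torch.tanh(self.x2h(x) + self.h2h(h))

# Unroll over a short frame sequence.
cell = ConvRNNCell(in_ch=3, hidden_ch=16)
frames = torch.randn(5, 1, 3, 32, 32)     # (time, batch, C, H, W)
h = torch.zeros(1, 16, 32, 32)
for x in frames:
    h = cell(x, h)                        # h: (1, 16, 32, 32)
```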
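A sketch of a contrastive embedding loss (a generic InfoNCE-style objective, as in SimCLR-like setups, not necessarily the lecture's exact method): embeddings of two views of the same input are pulled together while all other pairs in the batch are pushed apart.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """z1, z2: (n, d) embeddings of two augmented views of the same n inputs.
    Row i of z1 should match row i of z2 (the positive pair); every other
    row in the batch serves as a negative."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / temperature        # (n, n) cosine similarities
    targets = torch.arange(z1.shape[0])     # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
loss = info_nce(z1, z2)
```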
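And a sketch of the β-VAE objective: the standard VAE ELBO with the KL term scaled by β > 1, which pressures the latents to disentangle. The encoder and decoder are assumed to exist; only the loss is shown, with illustrative shapes.

```python
import torch
import torch.nn.functional as F

def beta_vae_loss(x, x_recon, mu, logvar, beta=4.0):
    """β-VAE objective: reconstruction + beta * KL(q(z|x) || N(0, I)).
    mu, logvar: (n, latent_dim) Gaussian posterior parameters from the encoder.
    beta = 1 recovers the ordinary VAE; beta > 1 upweights the KL term."""
    recon = F.mse_loss(x_recon, x, reduction="sum") / x.shape[0]
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / x.shape[0]
    return recon + beta * kl

# Illustrative tensors; a real encoder/decoder would produce these.
x = torch.rand(16, 784)
x_recon, mu, logvar = torch.rand(16, 784), torch.randn(16, 10), torch.randn(16, 10)
loss = beta_vae_loss(x, x_recon, mu, logvar)
```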