Neural architecture search (NAS) is a technique for automatically finding a well-performing neural network architecture for a given task. A NAS method typically defines a search space of candidate architectures together with a search strategy (for example random search, reinforcement learning, evolutionary algorithms, or gradient-based methods) that explores that space, guided by an evaluation signal such as validation accuracy.
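For intuition, below is a minimal sketch (not taken from any of the resources listed here) of the simplest search strategy, random search over a hypothetical toy search space. The `evaluate` function is a placeholder for the expensive train-and-validate step; RL-, evolution-, and gradient-based NAS methods replace or accelerate the sampling loop shown here.

```python
import random

# Hypothetical toy search space: each architecture is a choice of depth,
# width, and per-layer operation. Real NAS spaces (e.g. cell-based spaces)
# are far larger and more structured.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "width": [64, 128, 256],
    "op": ["conv3x3", "conv5x5", "depthwise_sep"],
}

def sample_architecture():
    """Randomly sample one candidate architecture from the search space."""
    return {key: random.choice(values) for key, values in SEARCH_SPACE.items()}

def evaluate(arch):
    """Placeholder for the expensive step: train the candidate (or a cheap
    proxy of it) and return a validation score. Here it is just random."""
    return random.random()

def random_search(num_trials=20):
    """Sample candidates, evaluate each, and keep the best one seen."""
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture()
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search()
    print(f"Best architecture found: {arch} (score={score:.3f})")
```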
- MIT HAN Lab Lecture on Neural Architecture Search (I, II, III)
- AutoML and Literature on NAS
- Awesome AutoDL
- Awesome Architecture Search