Abstract
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks, a widely used class of models in the field of machine learning. NAS has been used to design networks that are on par with or outperform hand-designed architectures. Since manually designing architectures is laborious and difficult to carry out without adequate experience, NAS enables the discovery of novel, state-of-the-art architectures. Nonetheless, successfully implementing NAS also requires extensive experience with both neural networks and optimization methods. NAS methods can be categorized according to the search space, search strategy, and performance estimation strategy they use. Neural Operations Research and Development decouples the implementation of networks from their design, enabling existing methods to be applied to novel datasets and results to be compared fairly. It thus aims to make NAS more accessible to researchers as well as industry practitioners.