Local Search is State of the Art for NAS Benchmarks
Local search is one of the simplest families of algorithms in combinatorial optimization, yet it yields strong approximation guarantees for canonical NP-hard problems such as the traveling salesman problem and vertex cover. Although it is ubiquitous in theoretical computer science, local search has been largely neglected in hyperparameter optimization and has never been used to perform neural architecture search (NAS). We show that the simplest local search instantiations achieve state-of-the-art results on the most popular existing NAS benchmarks (NASBench-101 and NASBench-201). For example, on CIFAR-100 with the NASBench-201 search space, local search reaches the global optimum after training just 127 architectures on average, outperforming many popular NAS algorithms. However, local search fails to perform well on the much larger DARTS search space. We present a thorough theoretical and empirical study explaining why local search succeeds on smaller, structured search spaces.
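To make the method concrete, below is a minimal sketch of greedy local search (hill climbing) over a discrete architecture space. The helpers `random_architecture`, `neighbors`, and `evaluate`, as well as the query budget and restart policy, are hypothetical placeholders standing in for a tabular benchmark's sampling, mutation, and lookup routines; this is not the NASBench API or the authors' exact implementation.

```python
# A minimal sketch of local search for NAS, assuming three hypothetical helpers:
#   random_architecture() -> arch        (sample a random point in the search space)
#   neighbors(arch)       -> list[arch]  (architectures differing by one edit)
#   evaluate(arch)        -> float       (validation accuracy looked up in the benchmark)
from typing import Any, Callable, List, Tuple


def local_search(
    random_architecture: Callable[[], Any],
    neighbors: Callable[[Any], List[Any]],
    evaluate: Callable[[Any], float],
    query_budget: int = 150,
) -> Tuple[Any, float]:
    """Greedy hill climbing: move to the best neighbor until none improves."""
    queries = 0
    current = random_architecture()
    current_acc = evaluate(current)
    queries += 1

    while queries < query_budget:
        best_neighbor, best_acc = None, current_acc
        for arch in neighbors(current):
            if queries >= query_budget:
                break
            acc = evaluate(arch)
            queries += 1
            if acc > best_acc:
                best_neighbor, best_acc = arch, acc

        if best_neighbor is None:
            # Local optimum reached; restart from a new random architecture
            # (one possible policy; the paper's variant may differ).
            current = random_architecture()
            current_acc = evaluate(current)
            queries += 1
        else:
            current, current_acc = best_neighbor, best_acc

    return current, current_acc
```

The key design choice is the neighborhood: on cell-based spaces such as NASBench-201, a neighbor is typically an architecture that differs by a single operation or edge, so each step only requires evaluating a handful of candidates.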