FlexiBO: Cost-Aware Multi-Objective Optimization of Deep Neural Networks

by Md Shahriar Iqbal, et al.

One of the key challenges in designing machine learning systems is striking the right balance among several objectives, which are often incommensurable and conflicting. For example, when designing deep neural networks (DNNs), one often has to trade off multiple objectives, such as accuracy, energy consumption, and inference time. Typically, no single configuration performs equally well on all objectives, so one is interested in identifying Pareto-optimal designs. Although many multi-objective optimization algorithms have been developed to identify Pareto-optimal configurations, state-of-the-art methods do not account for the different costs of evaluating each objective. This is particularly important for optimizing DNNs: the cost of assessing the accuracy of a DNN is orders of magnitude higher than that of measuring the energy consumption of a pre-trained DNN. We propose FlexiBO, a flexible Bayesian optimization method, to address this issue. We formulate a new acquisition function based on the improvement of the Pareto hypervolume, weighted by the measurement cost of each objective. Our acquisition function selects the next sample and objective that together provide the maximum information gain per unit of cost. We evaluated FlexiBO on 7 state-of-the-art DNNs for object detection, natural language processing, and speech recognition. Our results indicate that, compared to other state-of-the-art methods across the 7 architectures we tested, the Pareto front obtained using FlexiBO has, on average, a 28.44% higher contribution to the true Pareto front and achieves 25.64% better diversity.
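The core acquisition idea in the abstract — expected improvement of the Pareto hypervolume, discounted by each objective's measurement cost — can be sketched in a few lines. The following is a minimal toy, not the paper's implementation: the `predict` surrogate interface, the `costs` vector, and the UCB-style "optimistic in one objective" proxy with weight `beta` are illustrative assumptions.

```python
# Toy sketch of a cost-aware acquisition in the spirit of FlexiBO, for two
# objectives that are both maximized. All interfaces here are assumptions
# for illustration, not the paper's exact algorithm.

def pareto_hypervolume(points, ref):
    """Hypervolume dominated by a set of 2-D points (maximization) w.r.t. `ref`."""
    pts = sorted((p for p in points if p[0] > ref[0] and p[1] > ref[1]),
                 key=lambda p: -p[0])  # first objective, descending
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y > prev_y:                 # only non-dominated slices add area
            hv += (x - ref[0]) * (y - prev_y)
            prev_y = y
    return hv

def flexibo_step(candidates, pareto_points, ref, predict, costs, beta=1.0):
    """Pick the (design, objective) pair with the largest optimistic
    hypervolume gain per unit measurement cost.

    `predict(x)` returns per-objective (means, stds) from a surrogate model;
    `costs[j]` is the cost of measuring objective j (hypothetical interfaces).
    """
    base = pareto_hypervolume(pareto_points, ref)
    best, best_score = None, float("-inf")
    for x in candidates:
        mu, sigma = predict(x)
        for j, cost_j in enumerate(costs):
            optimistic = list(mu)
            optimistic[j] += beta * sigma[j]  # optimism only in objective j
            gain = pareto_hypervolume(list(pareto_points) + [tuple(optimistic)],
                                      ref) - base
            if gain / cost_j > best_score:
                best, best_score = (x, j), gain / cost_j
    return best
```

Dividing by `costs[j]` is what makes the search cost-aware: when accuracy is orders of magnitude more expensive to measure than energy, the cheap objective is evaluated preferentially unless the expensive one promises a proportionally larger hypervolume gain.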

