Hybrid Algorithm Selection and Hyperparameter Tuning on Distributed Machine Learning Resources: A Hierarchical Agent-based Approach

by Ahmad Esmaeili et al.

Algorithm selection and hyperparameter tuning are critical steps in both academic and applied machine learning. These steps are becoming increasingly delicate, however, as machine learning resources grow in number, diversity, and geographic distribution. Multi-agent systems, when applied to the design of machine learning platforms, offer several distinctive advantages, including scalability, flexibility, and robustness. This paper proposes a fully automatic and collaborative agent-based mechanism for selecting among distributedly organized machine learning algorithms while simultaneously tuning their hyperparameters. Our method builds upon an existing agent-based hierarchical machine learning platform and augments its query structure to support these functionalities without being tied to any specific learning, selection, or tuning mechanism. We have conducted theoretical assessments, formal verification, and analytical study to demonstrate the correctness, resource utilization, and computational efficiency of our technique. According to the results, our solution is provably correct and exhibits linear time and space complexity in the size of the available resources. To provide concrete examples of how the proposed methodology adapts and performs across a range of algorithmic options and datasets, we have also conducted a series of experiments on a system comprising 24 algorithms and 9 datasets.
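The paper's actual query structure is not reproduced on this page, but the core idea it describes, a hierarchy of agents that collaboratively answer a combined selection-and-tuning query, can be sketched as follows. This is a minimal illustration under assumptions of ours: the class names `LeafAgent` and `ManagerAgent`, the toy scoring functions, and the exhaustive grid search at each leaf are all hypothetical stand-ins, not the platform's real API or tuning strategy.

```python
import itertools

class LeafAgent:
    """Wraps a single learning algorithm and tunes its hyperparameters locally.

    Here each leaf does an exhaustive grid search over a small parameter grid;
    the paper's mechanism is not limited to any particular tuning strategy.
    """
    def __init__(self, name, score_fn, param_grid):
        self.name = name
        self.score_fn = score_fn        # maps hyperparameters -> validation score
        self.param_grid = param_grid    # {param_name: [candidate values]}

    def handle_query(self):
        best = None
        keys = list(self.param_grid)
        for values in itertools.product(*(self.param_grid[k] for k in keys)):
            params = dict(zip(keys, values))
            score = self.score_fn(**params)
            if best is None or score > best[2]:
                best = (self.name, params, score)
        return best  # (algorithm name, best hyperparameters, best score)

class ManagerAgent:
    """Internal node: forwards the query to its children and keeps the best reply.

    Selection thus emerges from the hierarchy itself: the root's answer names
    the winning algorithm together with its tuned hyperparameters.
    """
    def __init__(self, children):
        self.children = children

    def handle_query(self):
        return max((c.handle_query() for c in self.children), key=lambda r: r[2])

# Two toy "algorithms" whose scores depend only on the sampled hyperparameters.
svm = LeafAgent("svm", lambda C: 0.8 - abs(C - 1.0) * 0.1, {"C": [0.1, 1.0, 10.0]})
knn = LeafAgent("knn", lambda k: 0.9 - abs(k - 5) * 0.05, {"k": [1, 3, 5, 9]})
root = ManagerAgent([ManagerAgent([svm]), knn])  # a small two-level hierarchy

name, params, score = root.handle_query()  # -> ("knn", {"k": 5}, 0.9)
```

In this sketch each agent only touches its own subtree, so the total work grows linearly with the number of leaf algorithms, mirroring the linear time and space complexity claimed in the abstract.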




