Machine learning has made conspicuous progress over the past decade in diverse areas such as image analysis, computer vision, and natural language processing. It plays a crucial role in harnessing the power of the massive amounts of data produced every day in our digital world.
Building a high-quality machine learning model is an iterative, complex, and time-consuming process that involves trying different algorithms and techniques and effectively tuning their hyper-parameters.
With the continuous and massive increase in the amount of data in our world, there is a vital need to automate the process of building good machine learning models.
Various techniques and platforms have been introduced to tackle the challenges of automating the combined algorithm selection and hyper-parameter tuning (CASH) process in this domain.
Meta-learning is the process of learning from previous experience gained by applying various learning algorithms to different kinds of data, in order to reduce the time needed to learn new tasks.
These techniques can be categorized into three groups: learning based on task properties, learning from previous model evaluations, and learning from already pretrained models.
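To make the first category concrete, the following is a minimal sketch of meta-learning from task properties: each dataset is summarized by a handful of meta-features, and the configuration that performed best on the most similar previously seen dataset is reused as a warm start. The meta-feature set, the warm_start_config helper, and the stored experience are illustrative assumptions rather than the API of any particular tool.

```python
# A minimal sketch of meta-learning from task properties. Datasets are described
# by a few meta-features, and the best-known configuration of the most similar
# past dataset is reused as a warm start. The meta-feature set and the stored
# "experience" below are illustrative assumptions, not any specific tool's API.
import numpy as np

def meta_features(X, y):
    """Summarize a task with simple dataset-level statistics."""
    classes, counts = np.unique(y, return_counts=True)
    probs = counts / counts.sum()
    class_entropy = -np.sum(probs * np.log2(probs))
    return np.array([np.log(X.shape[0]), np.log(X.shape[1]),
                     len(classes), class_entropy])

def warm_start_config(X_new, y_new, experience):
    """experience: list of (meta_feature_vector, best_config) from past tasks."""
    target = meta_features(X_new, y_new)
    distances = [np.linalg.norm(target - mf) for mf, _ in experience]
    return experience[int(np.argmin(distances))][1]

# Usage: suggest a starting configuration for a new dataset.
past_tasks = [
    (np.array([6.9, 2.3, 2, 1.0]), {"model": "svm", "C": 10.0}),
    (np.array([11.5, 5.0, 10, 3.3]), {"model": "random_forest", "n_estimators": 500}),
]
X_new = np.random.rand(5000, 30)
y_new = np.random.randint(0, 2, size=5000)
print(warm_start_config(X_new, y_new, past_tasks))
```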
Another direction for architecture search is evolutionary algorithms, which are well suited to optimizing arbitrary structures. LEAF is an evolutionary AutoML framework that optimizes hyper-parameters, network architecture, and network size.
It uses CoDeepNEAT, a powerful evolutionary algorithm based on NEAT (NeuroEvolution of Augmenting Topologies). LEAF achieved state-of-the-art results on image classification and natural language analysis tasks.
For supervised learning tasks, evolutionary approaches tend to outperform reinforcement learning approaches, especially when the neural network architecture is very complex and has millions of parameters to be tuned.
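As an illustration of the general evolutionary idea (not CoDeepNEAT itself), the sketch below evolves a small population of candidate network descriptions through selection and mutation; the genome encoding, mutation scheme, and stand-in fitness function are simplifying assumptions.

```python
# An illustration of the general evolutionary search idea behind frameworks such
# as LEAF: candidate networks are encoded as genomes, the fittest survive, and
# offspring are produced by mutation. This is not CoDeepNEAT; the genome
# encoding, mutation scheme, and stand-in fitness function are toy assumptions.
import random

def random_genome():
    return {"n_layers": random.randint(1, 6),
            "units": random.choice([32, 64, 128, 256]),
            "lr": 10 ** random.uniform(-4, -1)}

def mutate(genome):
    child = dict(genome)
    key = random.choice(list(child))
    if key == "n_layers":
        child[key] = max(1, child[key] + random.choice([-1, 1]))
    elif key == "units":
        child[key] = random.choice([32, 64, 128, 256])
    else:
        child[key] = 10 ** random.uniform(-4, -1)
    return child

def fitness(genome):
    # Stand-in for training the candidate network and measuring validation accuracy.
    return -abs(genome["n_layers"] - 3) - 10 * abs(genome["lr"] - 0.01)

population = [random_genome() for _ in range(20)]
for generation in range(15):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                                   # selection
    offspring = [mutate(random.choice(survivors)) for _ in range(10)]
    population = survivors + offspring                            # next generation
print(max(population, key=fitness))
```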
The model pipeline algorithm that is expected to achieve the best performance on the input dataset is selected first, and the next step is tuning the hyper-parameters of that model in order to further optimize its performance.
It is worth noting that some tools discretize the space of different learning algorithms into a finite number of model pipelines, so model selection is treated as a categorical parameter that is tuned first, before the hyper-parameters of the chosen model are tuned.
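A minimal sketch of this formulation follows: the algorithm choice is sampled as a categorical parameter, the remaining hyper-parameters are sampled conditionally on it, and plain random search with cross-validation stands in for the more sophisticated optimizers used by real AutoML tools; the candidate algorithms and their ranges are assumptions made for illustration.

```python
# A minimal sketch of treating model selection as a categorical parameter in the
# combined algorithm selection and hyper-parameter tuning (CASH) formulation.
# Plain random search with cross-validation stands in for the more sophisticated
# optimizers used by real AutoML tools; the candidate space is an assumption.
import random
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def sample_configuration():
    algorithm = random.choice(["random_forest", "svm"])       # categorical choice
    if algorithm == "random_forest":                           # conditional hyper-parameters
        model = RandomForestClassifier(n_estimators=random.choice([50, 100, 300]),
                                       max_depth=random.choice([None, 5, 10]))
    else:
        model = SVC(C=10 ** random.uniform(-2, 2), gamma="scale")
    return algorithm, model

X, y = load_breast_cancer(return_X_y=True)
best_name, best_score = None, -1.0
for _ in range(20):
    name, model = sample_configuration()
    score = cross_val_score(model, X, y, cv=3).mean()
    if score > best_score:
        best_name, best_score = name, score
print("best algorithm: %s, cv accuracy: %.3f" % (best_name, best_score))
```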
In general, several hyper-parameter optimization techniques are based on ideas borrowed from statistical model selection and traditional optimization.
Automated hyper-parameter tuning techniques can be classified into two main categories: black-box optimization techniques and multi-fidelity optimization techniques.
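As a concrete example of the multi-fidelity category, the sketch below implements a simple successive-halving loop in which the evaluation budget is the number of training examples; the configuration space and the evaluate helper are illustrative assumptions, not a particular library's interface.

```python
# A concrete sketch of a multi-fidelity technique (successive halving): many
# configurations are first evaluated on a small budget, and only the better half
# advances to the next, doubled budget. Here the budget is the number of training
# examples; the configuration space and evaluate() helper are assumptions.
import random
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

def evaluate(config, budget):
    """Train on the first `budget` examples and report validation accuracy."""
    model = LogisticRegression(C=config["C"], max_iter=1000)
    model.fit(X_train[:budget], y_train[:budget])
    return model.score(X_val, y_val)

configs = [{"C": 10 ** random.uniform(-3, 2)} for _ in range(16)]
budget = 100
while len(configs) > 1:
    scores = [evaluate(c, budget) for c in configs]
    ranked = sorted(range(len(configs)), key=lambda i: scores[i], reverse=True)
    configs = [configs[i] for i in ranked[:len(configs) // 2]]   # keep the top half
    budget = min(budget * 2, len(X_train))                       # raise the fidelity
print("selected configuration:", configs[0])
```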
Several cloud-based solutions have been introduced to tackle the automated machine learning problem, exploiting the high computational power available in cloud environments to try a wide range of models and configurations.
Google AutoML has been introduced as a building block of the artificial intelligence platform services supported by Google Cloud. It supports training a wide range of machine learning models in different domains with minimal user expertise, where models can be trained for various tasks covering sight, language, and structured data.
AutoML Vision and AutoML Video Intelligence are used to gain insight from visual data, such as object localization, detection, and classification in static images and video streams; pretrained models serve as the starting point for training custom models on user data.
Similarly, AutoML Natural Language and AutoML Translation provide the user with APIs for automatic language detection and translation, in addition to text analysis tasks such as sentiment classification and entity extraction.
These language services support ten distinct languages. In addition, AutoML Tables supports training high-quality models on tabular data by automating feature engineering, model selection, and hyper-parameter tuning.