The hyperparameters of a machine learning model are the parameters that control the learning process. They differ from one type of model to another (e.g. the number of hidden layers and the number of neurons for a Neural Network, or the kernel for a Gaussian Process).
Manual tuning
If you are already familiar with the effects of the hyperparameters and want to tune your model manually, you can define the parameters yourself by clicking on the advanced options while setting up a model. However, this approach is quite time-consuming (the model must be re-run after each change) and requires a good understanding of how the models are trained.
Automated hyperparameter optimisation
Another option is to let the platform try multiple models and find the best one. To do so, go to the advanced options of a model and select hyperparameter optimisation. The figure below shows what is displayed for a Neural Network.
For each hyperparameter, you can select multiple values to explore. You will then have to choose between two search methods:
- Exhaustive search: This method trains models with all possible combinations of values. In the example below, that would be 4 x 3 x 4 x 2 = 96 models to train and evaluate. This method should be used if you have a limited number of combinations, if models can be trained quickly, or if the overall training time is not an issue.
- Randomised search: This method randomly selects a user-defined number of combinations to compare (10 in the figure below). This method can be used when trying all combinations would take too long. Both strategies are sketched in the example after this list.
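To make the two strategies concrete, here is a minimal sketch using scikit-learn rather than the platform itself; the hyperparameter names and value grids below are illustrative assumptions, not the platform's actual settings or internals.

```python
# Illustrative sketch of exhaustive vs. randomised hyperparameter search.
# Assumes scikit-learn; the hyperparameters and values are made up for the example.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# 4 x 3 x 4 x 2 = 96 combinations, mirroring the count in the example above.
param_grid = {
    "hidden_layer_sizes": [(16,), (32,), (64,), (32, 32)],
    "alpha": [1e-4, 1e-3, 1e-2],
    "learning_rate_init": [1e-4, 3e-4, 1e-3, 3e-3],
    "activation": ["relu", "tanh"],
}

# Exhaustive search: trains and evaluates all 96 combinations.
grid = GridSearchCV(MLPClassifier(max_iter=500), param_grid, cv=3)
grid.fit(X, y)
print("Exhaustive best:", grid.best_params_, grid.best_score_)

# Randomised search: samples a fixed number of combinations (10 here).
rand = RandomizedSearchCV(
    MLPClassifier(max_iter=500), param_grid, n_iter=10, cv=3, random_state=0
)
rand.fit(X, y)
print("Randomised best:", rand.best_params_, rand.best_score_)
```

The trade-off is the same as described above: the exhaustive run fits all 96 models, while the randomised run fits only the 10 sampled ones, at the cost of possibly missing the single best combination.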
Hyperparameters are hidden in the advanced options section. Read this article for a full description of the advanced options including cross validation and hyperparameter optimisation.