Hyperparameter Tuning of Tensorflow Model | Hidden Layer size and number of hidden layers


I need to tune the number of hidden layers and the size of each hidden layer in a regression model.

As I found in earlier experiments, generic hyperparameter optimization algorithms (grid search and random search) are not sufficient because of the large number of hyperparameters. Could I use Population Based Training (PBT) or Bayesian optimization to tune the network structure? In general, are there any optimization methods for tuning the hidden layer size or the number of hidden layers other than grid search and random search?

CodePudding user response:

If you're using PyTorch, the easiest way to do this is Auto-PyTorch -- it takes care of finding the best neural architecture for you (within the specified budget) and is more efficient than random search.

There's a lot more information on its website, including pointers to papers describing the implemented methodology and benchmark comparisons to other approaches.
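To make this concrete, here is a minimal sketch of an Auto-PyTorch run for tabular regression. The data variables (X_train, y_train, X_test, y_test) and the budget values are placeholders, and the calls shown reflect the autoPyTorch package around version 0.2, so check the project's documentation for the current interface.

```python
# Minimal sketch: Auto-PyTorch neural architecture search for tabular regression.
# X_train/y_train/X_test/y_test are assumed placeholders (e.g. NumPy arrays).
from autoPyTorch.api.tabular_regression import TabularRegressionTask

api = TabularRegressionTask()
api.search(
    X_train=X_train,
    y_train=y_train,
    X_test=X_test,
    y_test=y_test,
    optimize_metric="r2",           # metric the search optimizes
    total_walltime_limit=300,       # overall search budget in seconds
    func_eval_time_limit_secs=60,   # budget per evaluated configuration
)
print(api.sprint_statistics())      # summary of the completed search
y_pred = api.predict(X_test)        # predictions from the best found ensemble
```

Within the wall-time budget, Auto-PyTorch explores architectures (including depth and layer widths) and training hyperparameters jointly, then ensembles the best configurations it found.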

CodePudding user response:

You might want to give the Hyperband tuner (available in the Keras Tuner library) a try. For an example, see the section "Fine-Tuning Neural Network Hyperparameters" in https://github.com/ageron/handson-ml3/blob/main/10_neural_nets_with_keras.ipynb

It is described in "Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow" (3rd edition, p. 347).
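For a self-contained illustration, here is a minimal sketch of using Keras Tuner's Hyperband to search over both the number of hidden layers and their widths for a regression model. The input dimension, the search ranges, and the variable names (X_train, y_train, X_val, y_val) are assumptions for the example, not part of the original answer.

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    # Both the depth and the width of the network are hyperparameters.
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Input(shape=(8,)))  # assumed 8 input features
    for i in range(hp.Int("n_hidden", min_value=1, max_value=5)):
        model.add(tf.keras.layers.Dense(
            units=hp.Int(f"units_{i}", min_value=16, max_value=256, step=16),
            activation="relu"))
    model.add(tf.keras.layers.Dense(1))  # single regression output
    model.compile(optimizer="adam", loss="mse")
    return model

tuner = kt.Hyperband(
    build_model,
    objective="val_loss",
    max_epochs=30,        # maximum training budget per configuration
    factor=3,             # successive-halving downsampling factor
    directory="kt_logs",
    project_name="regression_tuning",
)

# X_train/y_train/X_val/y_val are placeholder arrays.
tuner.search(X_train, y_train, validation_data=(X_val, y_val))
best_model = tuner.get_best_models(num_models=1)[0]
```

Hyperband trains many configurations with small budgets and promotes only the promising ones to longer training runs, which is why it typically explores structural choices like depth and width more efficiently than plain random search.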
