This repository contains code to experiment with dlib's recently released global optimizer for neural network hyperparameter optimization.
- dlib: Install dlib by cloning its repository and following the build instructions there.
- TF slim: Clone the TF models repository and add slim to your `PYTHONPATH`:

  ```shell
  export PYTHONPATH=$PYTHONPATH:/path_to_your_folder/models/research/slim
  ```
- Python packages: Install all requirements via pip:

  ```shell
  pip install -r requirements.txt
  ```
Download the binary version of CIFAR-100:

```shell
wget http://www.cs.toronto.edu/~kriz/cifar-100-binary.tar.gz
```

To run optimization over the three hyperparameters `depth_multiplier`, `weight_decay`, and `dropout_keep_prob` with default settings, run
```shell
python optimize.py --data_dir <DATA_DIR> --out_dir <OUT_DIR>
```