Hyperparameter optimization for neural networks

This repository contains code to experiment with dlib's recently released global optimizer for neural network hyperparameter optimization.

Prerequisites

  • dlib: Install dlib by cloning its repository and following the build instructions there.
  • TF slim: Clone the TF models repository and add slim to your PYTHONPATH:
    export PYTHONPATH=$PYTHONPATH:/path_to_your_folder/models/research/slim
  • Python packages: Install the remaining requirements via pip:
    pip install -r requirements.txt

Download the binary version of CIFAR-100 from the CIFAR website, or run

wget http://www.cs.toronto.edu/~kriz/cifar-100-binary.tar.gz
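The downloaded archive still needs to be unpacked before training. A minimal sketch of that step in Python (the `data` output directory and the `extract_cifar` helper are assumptions, not part of the repository; point `--data_dir` at wherever you extract):

```python
import os
import tarfile


def extract_cifar(archive="cifar-100-binary.tar.gz", out_dir="data"):
    """Unpack the CIFAR-100 binary archive into out_dir.

    Hypothetical helper: the directory layout expected by optimize.py
    may differ -- adjust out_dir to match your --data_dir.
    """
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(out_dir)


# Only extract if the archive from the wget step above is present.
if os.path.exists("cifar-100-binary.tar.gz"):
    extract_cifar()
```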

Usage

To run optimization over the three hyperparameters depth_multiplier, weight_decay, and dropout_keep_prob with default settings, run

python optimize.py --data_dir <DATA_DIR> --out_dir <OUT_DIR>
