# Configuration

RedisAI supports configuration options that can be set at run-time, as well as options that must be specified when the module is loaded.

## Configuration Options During Loading

In general, configuration options are passed by appending arguments after the `--loadmodule` argument on the command line, after the `loadmodule` directive in a Redis config file, or after the `MODULE LOAD` command.

The module's dynamic library `redisai.so` can be located in any path, provided that the full path, or a path relative to where the `redis-server` command is issued, is specified. The additional arguments are options passed to the module. The currently supported options are:

- `BACKENDSPATH`: specify the default backends path used when loading a dynamic backend library.
- `TORCH`: specify the location of the PyTorch backend library and dynamically load it. The location can be given either as an absolute path or as a path relative to `BACKENDSPATH`. Using this option replaces the need for loading the PyTorch backend at run-time.
- `TF`: specify the location of the TensorFlow backend library and dynamically load it. The location can be given either as an absolute path or as a path relative to `BACKENDSPATH`. Using this option replaces the need for loading the TensorFlow backend at run-time.
- `TFLITE`: specify the location of the TensorFlow Lite backend library and dynamically load it. The location can be given either as an absolute path or as a path relative to `BACKENDSPATH`. Using this option replaces the need for loading the TensorFlow Lite backend at run-time.
- `ONNX`: specify the location of the ONNXRuntime backend library and dynamically load it. The location can be given either as an absolute path or as a path relative to `BACKENDSPATH`. Using this option replaces the need for loading the ONNXRuntime backend at run-time.
- `THREADS_PER_QUEUE`: specify a fixed number of worker threads to create up front per device. This option is described in detail in the [THREADS_PER_QUEUE](#threads_per_queue) section and can only be set when loading the module.

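
The absolute-versus-relative location resolution described for the backend options above can be sketched in Python. `resolve_backend_path` is a hypothetical helper for illustration only, not part of RedisAI:

```python
import os.path

def resolve_backend_path(location, backendspath):
    # Hypothetical helper: an absolute location is used as-is;
    # anything else is taken relative to BACKENDSPATH.
    if os.path.isabs(location):
        return location
    return os.path.join(backendspath, location)

# An absolute location is returned unchanged:
resolve_backend_path("/usr/lib/redis/modules/redisai/backends/redisai_torch/redisai_torch.so",
                     "/opt/backends")
# A relative location is joined onto BACKENDSPATH:
resolve_backend_path("redisai_torch/redisai_torch.so", "/opt/backends")
```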

### Configuration Examples

In redis.conf:

```
loadmodule redisai.so OPT1 OPT2
```

From redis-cli:

```
127.0.0.1:6379> MODULE LOAD redisai.so OPT1 OPT2
```

From the command line, using a relative path:

```
$ redis-server --loadmodule ./redisai.so OPT1 OPT2
```

From the command line, using a full path:

```
$ redis-server --loadmodule /usr/lib/redis/modules/redisai.so OPT1 OPT2
```
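
Several options can also be combined in a single invocation; the paths and values below are illustrative:

```shell
redis-server --loadmodule /usr/lib/redis/modules/redisai.so \
    BACKENDSPATH /usr/lib/redis/modules/redisai/backends \
    TORCH redisai_torch/redisai_torch.so \
    THREADS_PER_QUEUE 4
```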

### THREADS_PER_QUEUE

```
THREADS_PER_QUEUE {number}
```

Enables configuring the main thread to create a fixed number of worker threads up front per device. This controls the maximum number of threads used for the parallel execution of independent operations.

This option can significantly improve model run performance for simple models (models that require little computation), since there is usually room for extra computation on modern CPUs and hardware accelerators (GPUs, TPUs, etc.).
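
Conceptually, `THREADS_PER_QUEUE` sizes a fixed pool of workers consuming requests from a per-device queue. A minimal Python sketch of that pattern follows; it illustrates the threading model only and is not RedisAI's actual implementation:

```python
import queue
import threading

THREADS_PER_QUEUE = 4  # fixed pool size per device, set at load time

def worker(q, results):
    # Each worker pulls independent run requests off its device queue.
    while True:
        job = q.get()
        if job is None:  # shutdown sentinel
            break
        results.append(job * job)  # stand-in for running a model
        q.task_done()

device_queue = queue.Queue()
results = []
pool = [threading.Thread(target=worker, args=(device_queue, results))
        for _ in range(THREADS_PER_QUEUE)]
for t in pool:
    t.start()
for job in range(8):        # eight independent requests
    device_queue.put(job)
device_queue.join()         # wait for all requests to finish
for _ in pool:
    device_queue.put(None)  # stop the workers
for t in pool:
    t.join()
```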

#### THREADS_PER_QUEUE Default

By default, one worker thread is used per device.

#### THREADS_PER_QUEUE Example

```
$ redis-server --loadmodule ./redisai.so THREADS_PER_QUEUE 4
```

---


## Setting Configuration Options at Run-Time

### AI.CONFIG BACKENDSPATH

Specify the default backends path to use when dynamically loading a backend.

```sql
AI.CONFIG BACKENDSPATH <default_location_of_backend_libraries>
```

#### AI.CONFIG BACKENDSPATH Example

```sql
AI.CONFIG BACKENDSPATH /usr/lib/redis/modules/redisai/backends
```

### AI.CONFIG LOADBACKEND

Load a DL/ML backend.

```sql
AI.CONFIG LOADBACKEND <backend_identifier> <location_of_backend_library>
```

RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends.

The allowed backend identifiers are:

- `TF` (TensorFlow)
- `TFLITE` (TensorFlow Lite)
- `TORCH` (PyTorch)
- `ONNX` (ONNXRuntime)

By default, RedisAI starts with the ability to set and get tensor data, but setting and running models and scripts requires a computing backend to be loaded. This can be done during loading, as [explained above](#configuration-options-during-loading), or at run-time using the `AI.CONFIG` command.

This command allows dynamically loading a backend by specifying the backend identifier and the path to the backend library. Currently, once loaded, a backend cannot be unloaded, and at most one backend can be loaded per identifier.

If you don't specify a backend at load time, RedisAI will lazily look in the default location when a model of a given backend is loaded.

The default locations are relative to the `BACKENDSPATH` directory. If unspecified, by default RedisAI will look for:

- the ONNXRuntime dynamic library at `<BACKENDSPATH>/redisai_onnxruntime/redisai_onnxruntime.so`
- the TensorFlow dynamic library at `<BACKENDSPATH>/redisai_tensorflow/redisai_tensorflow.so`
- the TensorFlow Lite dynamic library at `<BACKENDSPATH>/redisai_tflite/redisai_tflite.so`
- the PyTorch dynamic library at `<BACKENDSPATH>/redisai_torch/redisai_torch.so`

Any library dependencies will be resolved automatically, and these directories are portable across all platforms. When the location given to `LOADBACKEND` is relative, it is relative to `BACKENDSPATH`.

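The default locations above amount to a lookup table keyed by backend identifier. The sketch below uses a hypothetical helper, `default_backend_location`, for illustration only:

```python
# Hypothetical mapping mirroring the default locations listed above.
DEFAULT_BACKEND_LIBS = {
    "TF": "redisai_tensorflow/redisai_tensorflow.so",
    "TFLITE": "redisai_tflite/redisai_tflite.so",
    "TORCH": "redisai_torch/redisai_torch.so",
    "ONNX": "redisai_onnxruntime/redisai_onnxruntime.so",
}

def default_backend_location(backendspath, identifier):
    # Where a backend that was not explicitly configured would be
    # looked up, per the list above.
    return f"{backendspath}/{DEFAULT_BACKEND_LIBS[identifier]}"
```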

#### AI.CONFIG LOADBACKEND Examples

Load the TORCH backend, relative to `BACKENDSPATH`:

```sql
AI.CONFIG LOADBACKEND TORCH redisai_torch/redisai_torch.so
```

Load the TORCH backend, specifying the full path:

```sql
AI.CONFIG LOADBACKEND TORCH /usr/lib/redis/modules/redisai/backends/redisai_torch/redisai_torch.so
```