RedisAI Commands

AI.CONFIG LOADBACKEND

Load a DL/ML backend.

By default, RedisAI starts with the ability to set and get tensor data, but setting and running models and scripts requires a computing backend to be loaded. This command loads a backend dynamically by specifying the backend identifier and the path to the backend library. Currently, once loaded, a backend cannot be unloaded, and at most one backend can be loaded per identifier.

AI.CONFIG LOADBACKEND <backend_identifier> <location_of_backend_library>
  • Allowed backend identifiers are TF (TensorFlow), TORCH (PyTorch), and ONNX (ONNXRuntime).

Backends can also be specified on the command line when starting redis-server; see the example below.

AI.CONFIG LOADBACKEND Example

Load the TORCH backend

AI.CONFIG LOADBACKEND TORCH install/backend/redisai_torch/redisai_torch.so

Load the TORCH backend at the command-line

redis-server --loadmodule install/redisai.so TORCH install/backend/redisai_torch/redisai_torch.so

This removes the need to load a backend at runtime using AI.CONFIG LOADBACKEND.
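The backend can also be loaded from a client library rather than redis-cli. Below is a minimal sketch using the redis-py client and its generic execute_command helper; the client choice, connection settings, and library path are assumptions and should be adjusted to your installation.

import redis

r = redis.Redis(host='localhost', port=6379)

# Load the TORCH backend from the assumed installation path
r.execute_command('AI.CONFIG', 'LOADBACKEND', 'TORCH',
                  'install/backend/redisai_torch/redisai_torch.so')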

AI.TENSORSET

Set a tensor.

Stores a tensor of the given data type, with the shape specified by shape1..shapeN.

AI.TENSORSET tensor_key data_type shape1 shape2 ... [BLOB data | VALUES val1 val2 ...]
  • tensor_key - Key for storing the tensor
  • data_type - Numeric data type of tensor elements, one of FLOAT, DOUBLE, INT8, INT16, INT32, INT64, UINT8, UINT16
  • shape1..shapeN - Shape of the tensor, i.e. the number of elements along each axis

Optional args:

  • BLOB data - Provide tensor content as a binary buffer
  • VALUES val1 val2 ... - Provide tensor content as individual values

If no BLOB or VALUES are specified, the tensor is allocated but not initialized to any value.

TENSORSET Example

Set a 2x2 FLOAT tensor at key foo with values 1 2 3 4

AI.TENSORSET foo FLOAT 2 2 VALUES 1 2 3 4
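For larger tensors, sending a binary buffer with BLOB is usually more practical than listing VALUES. The sketch below stores the same 2x2 FLOAT tensor from a numpy array; it assumes the redis-py and numpy packages and a local Redis instance, and the key name is a placeholder.

import numpy as np
import redis

r = redis.Redis(host='localhost', port=6379)

# 2x2 tensor of 32-bit floats, matching the FLOAT data type
arr = np.array([[1, 2], [3, 4]], dtype=np.float32)

# Pass the raw buffer with BLOB instead of individual VALUES
r.execute_command('AI.TENSORSET', 'foo', 'FLOAT', 2, 2, 'BLOB', arr.tobytes())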

AI.TENSORGET

Get a tensor.

AI.TENSORGET tensor_key [BLOB | VALUES | META]
  • tensor_key - Key for the tensor
  • BLOB - Return tensor content as a binary buffer
  • VALUES - Return tensor content as a list of values
  • META - Return only the tensor metadata (data type and shape)

TENSORGET Example

Get the binary data for the tensor at key foo. Metadata is also returned.

AI.TENSORGET foo BLOB
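A client can turn the returned blob back into an array. The sketch below uses redis-py and numpy and assumes the tensor at foo is the 2x2 FLOAT tensor from the TENSORSET example; since the reply also carries metadata and its exact layout may vary between RedisAI versions, the indexing below is an assumption to verify against your deployment.

import numpy as np
import redis

r = redis.Redis(host='localhost', port=6379)

reply = r.execute_command('AI.TENSORGET', 'foo', 'BLOB')

# Assumption: the binary buffer is the last element of the reply, after the metadata
blob = reply[-1] if isinstance(reply, (list, tuple)) else reply
arr = np.frombuffer(blob, dtype=np.float32).reshape(2, 2)
print(arr)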

AI.MODELSET

Set a model.

AI.MODELSET model_key backend device [INPUTS name1 name2 ... OUTPUTS name1 name2 ...] model_blob
  • model_key - Key for storing the model
  • backend - The backend corresponding to the model being set. Allowed values: TF, TORCH, ONNX.
  • device - Device where the model is loaded and where the computation will run. Allowed values: CPU, GPU.
  • INPUTS name1 name2 ... - Name of the nodes in the provided graph corresponding to inputs [TF backend only]
  • OUTPUTS name1 name2 ... - Name of the nodes in the provided graph corresponding to outputs [TF backend only]
  • model_blob - Binary buffer containing the model protobuf saved from a supported backend

MODELSET Example

AI.MODELSET resnet18 TORCH GPU < foo.pt
AI.MODELSET resnet18 TF CPU INPUTS in1 OUTPUTS linear4 < foo.pt
AI.MODELSET mnist_net ONNX CPU < mnist.onnx
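In the examples above, < foo.pt stands for passing the file contents as the final model_blob argument. A sketch of the same operation from redis-py; the file name, key name, and device are placeholders taken from the examples.

import redis

r = redis.Redis(host='localhost', port=6379)

# Read the serialized model and pass its bytes as model_blob
with open('foo.pt', 'rb') as f:
    model_blob = f.read()

# TORCH model on GPU; a TF model would also need INPUTS and OUTPUTS node names
r.execute_command('AI.MODELSET', 'resnet18', 'TORCH', 'GPU', model_blob)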

AI.MODELGET

Get a model.

AI.MODELGET model_key
  • model_key - Key for the model

The command returns the model as serialized by the backend, that is, a string containing a protobuf.
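For example, a client could fetch the serialized model and write it back to disk. A minimal redis-py sketch, assuming a model was previously stored at resnet18 and that the reply is the raw serialized blob described above; the output file name is a placeholder.

import redis

r = redis.Redis(host='localhost', port=6379)

# Reply is the model as serialized by the backend
model_blob = r.execute_command('AI.MODELGET', 'resnet18')

with open('resnet18_export.pt', 'wb') as f:
    f.write(model_blob)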

AI.MODELDEL

Removes a model at a specified key.

AI.MODELDEL model_key
  • model_key - Key for the model

Currently, the command is fully equivalent to calling DEL on model_key.
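From a client this is a single command. A one-line redis-py sketch, with the key name as a placeholder:

import redis

# Remove the model stored at resnet18; currently equivalent to DEL resnet18
redis.Redis(host='localhost', port=6379).execute_command('AI.MODELDEL', 'resnet18')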

AI.MODELRUN

Run a model.

AI.MODELRUN model_key INPUTS input_key1 ... OUTPUTS output_key1 ...
  • model_key - Key for the model
  • INPUTS input_key1 ... - Keys for tensors to use as inputs
  • OUTPUTS output_key1 ... - Keys for storing output tensors

The request is queued and executed asynchronously on a separate thread. The client blocks until the computation finishes.

If needed, input tensors are copied to the device specified in AI.MODELSET before execution.

MODELRUN Example

AI.MODELRUN resnet18 INPUTS image12 OUTPUTS label12
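Putting the pieces together, a client typically sets the input tensor, runs the model, and reads the output. The sketch below uses redis-py and numpy and assumes resnet18 was stored as in the MODELSET example; the key names and the 1x3x224x224 FLOAT input shape are assumptions for illustration only.

import numpy as np
import redis

r = redis.Redis(host='localhost', port=6379)

# Placeholder input: shape and dtype must match what the model expects
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
r.execute_command('AI.TENSORSET', 'image12', 'FLOAT', 1, 3, 224, 224,
                  'BLOB', image.tobytes())

# The call returns once the asynchronous computation has finished
r.execute_command('AI.MODELRUN', 'resnet18', 'INPUTS', 'image12',
                  'OUTPUTS', 'label12')

# Read back the output tensor stored at label12
print(r.execute_command('AI.TENSORGET', 'label12', 'VALUES'))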

AI.SCRIPTSET

Set a script.

AI.SCRIPTSET script_key device script_source
  • script_key - Key for storing the script
  • device - The device where the script will execute
  • script_source - A string containing TorchScript source code

SCRIPTSET Example

Given addtwo.txt as:

def addtwo(a, b):
    return a + b

AI.SCRIPTSET addscript GPU < addtwo.txt
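The equivalent from a client reads the TorchScript source and passes it as a string. A redis-py sketch, assuming addtwo.txt is in the working directory; the key name and device are taken from the example above.

import redis

r = redis.Redis(host='localhost', port=6379)

# TorchScript source is passed as plain text, not as a binary blob
with open('addtwo.txt') as f:
    script_source = f.read()

r.execute_command('AI.SCRIPTSET', 'addscript', 'GPU', script_source)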

AI.SCRIPTGET

Get a script.

AI.SCRIPTGET script_key
  • script_key - Key for the script

SCRIPTGET Example

AI.SCRIPTGET addscript

AI.SCRIPTDEL

Removes a script at a specified key.

AI.SCRIPTDEL script_key
  • script_key - Key for the script

Currently, the command is fully equivalent to calling DEL on script_key.
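A short redis-py sketch covering the two commands above, assuming addscript was stored as in the SCRIPTSET example:

import redis

r = redis.Redis(host='localhost', port=6379)

# Fetch the TorchScript source stored at the key
print(r.execute_command('AI.SCRIPTGET', 'addscript'))

# Remove the script when it is no longer needed; currently equivalent to DEL addscript
r.execute_command('AI.SCRIPTDEL', 'addscript')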

AI.SCRIPTRUN

Run a script.

AI.SCRIPTRUN script_key fn_name INPUTS input_key1 ... OUTPUTS output_key1 ...
  • script_key - Key for the script
  • fn_name - Name of the function to execute
  • INPUTS input_key1 ... - Keys for tensors to use as inputs
  • OUTPUTS output_key1 ... - Keys for storing output tensors

If needed, input tensors are copied to the device specified in AI.SCRIPTSET before execution.

SCRIPTRUN Example

AI.SCRIPTRUN addscript addtwo INPUTS a b OUTPUTS c
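An end-to-end sketch with redis-py: store two input tensors, call the addtwo function from the script above, and read the result. The connection settings and tensor shapes are assumptions.

import redis

r = redis.Redis(host='localhost', port=6379)

# Two 1-D FLOAT tensors used as inputs a and b
r.execute_command('AI.TENSORSET', 'a', 'FLOAT', 2, 'VALUES', 1, 2)
r.execute_command('AI.TENSORSET', 'b', 'FLOAT', 2, 'VALUES', 3, 4)

# Run the addtwo function from the script stored at addscript
r.execute_command('AI.SCRIPTRUN', 'addscript', 'addtwo',
                  'INPUTS', 'a', 'b', 'OUTPUTS', 'c')

# Expect the element-wise sum [4, 6]
print(r.execute_command('AI.TENSORGET', 'c', 'VALUES'))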