
Commit c2e9f10

K-Jolantiga authored and committed
First page documentation (#77)
* First page documentation
* aligning readme and docs
* small fixes
1 parent 8d323ef commit c2e9f10

File tree

1 file changed: +74 -0 lines changed


docs/index.md

Lines changed: 74 additions & 0 deletions
@@ -1 +1,75 @@
# RedisAI Module

RedisAI is a Redis module for serving tensors and executing deep learning models.

## Quickstart

1. [Docker](#docker)
2. [Build](#building)
3. [Start](#start)

## Docker
To quickly try out RedisAI, launch an instance using Docker:

```sh
docker run -p 6379:6379 -it --rm redisai/redisai
```

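If `redis-cli` is available on the host (an assumption, it is not part of the instructions above), a quick way to confirm the container is reachable:

```sh
# ping the RedisAI container started above; a healthy server replies with PONG
redis-cli -p 6379 PING
```
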
## Building
This will check out, download, and build the libraries for the backends
(TensorFlow and PyTorch) for your platform.

```sh
bash get_deps.sh
```

Once the dependencies are downloaded, build the module itself. Note that
CMake 3.0 or higher is required.

```sh
mkdir build
cd build
cmake -DDEPS_PATH=../deps/install ..
make
cd ..
```
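
If the build succeeds, the module shared object should end up at `build/redisai.so`, which is the path used by the `--loadmodule` example below; a minimal sanity check:

```sh
# confirm the compiled module exists before trying to load it
ls -l build/redisai.so
```
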
## Start
You will need a redis-server version 4.0.9 or greater. This should be
available in most recent distributions:

```sh
redis-server --version
Redis server v=4.0.9 sha=00000000:0 malloc=libc bits=64 build=c49f4faf7c3c647a
```

To start Redis with the RedisAI module loaded, make sure the dependencies can be found by Redis. One way to do this on Linux is:

```sh
LD_LIBRARY_PATH=<PATH_TO>/deps/install/lib redis-server --loadmodule build/redisai.so
```
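
To confirm that Redis actually picked up the module, you can list the loaded modules from another terminal (the exact fields of the reply are not reproduced here):

```sh
# the RedisAI module should appear among the loaded modules
redis-cli MODULE LIST
```
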
If you want to run examples, make sure you have [git-lfs](https://git-lfs.github.com) installed when you clone.
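
A minimal sketch of such a clone, assuming the upstream repository URL (it is not stated on this page):

```sh
# install the Git LFS hooks once per machine, then clone so the example
# model files tracked by LFS are fetched as well
git lfs install
git clone https://github.com/RedisAI/RedisAI.git
```
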
On the client, load the model:

```sh
./deps/redis/src/redis-cli -x AI.MODELSET foo TF CPU INPUTS a b OUTPUTS c < graph.pb
```

Then create the input tensors, run the computation graph, and get the output tensor (see `load_model.sh`). Note the signatures:

* `AI.TENSORSET tensor_key data_type dim1..dimN [BLOB data | VALUES val1..valN]`
* `AI.MODELRUN graph_key INPUTS input_key1 ... OUTPUTS output_key1 ...`

```sh
redis-cli
> AI.TENSORSET bar FLOAT 2 VALUES 2 3
> AI.TENSORSET baz FLOAT 2 VALUES 2 3
> AI.MODELRUN foo INPUTS bar baz OUTPUTS jez
> AI.TENSORGET jez VALUES
1) FLOAT
2) 1) (integer) 2
3) 1) "4"
   2) "9"
```

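As the `AI.TENSORSET` signature above shows, the values can also be passed as a raw `BLOB` instead of `VALUES`. A hedged sketch of setting the same two-element float tensor from its little-endian float32 bytes (2.0 and 3.0), relying on redis-cli's `\xHH` escapes in double-quoted strings:

```sh
# 2.0f -> 00 00 00 40 and 3.0f -> 00 00 40 40 (little-endian float32)
redis-cli AI.TENSORSET bar FLOAT 2 BLOB "\x00\x00\x00\x40\x00\x00\x40\x40"
```
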
Full documentation of the API can be found [here](commands.md).
