From 8b4b654a6737d2820c260236829b713d77983b36 Mon Sep 17 00:00:00 2001 From: Luca Antiga Date: Thu, 30 Apr 2020 18:20:33 +0200 Subject: [PATCH] A few tweaks to the docs --- docs/commands.md | 24 ++++++++++++------------ docs/configuration.md | 2 ++ docs/developer.md | 6 ++---- docs/images/graph.pb.png | Bin 0 -> 8080 bytes docs/index.md | 5 +++-- docs/intro.md | 30 ++++++++++++++++++------------ 6 files changed, 37 insertions(+), 30 deletions(-) create mode 100644 docs/images/graph.pb.png diff --git a/docs/commands.md b/docs/commands.md index 1b6db8ccc..66edf5230 100644 --- a/docs/commands.md +++ b/docs/commands.md @@ -1,7 +1,7 @@ # RedisAI Commands RedisAI is a Redis module, and as such it implements several data types and the respective commands to use them. -All of RedisAI's commands are begin with the `AI.` prefix. The following sections describe these commands. +All of RedisAI's commands begin with the `AI.` prefix. The following sections describe these commands. **Syntax Conventions** @@ -365,7 +365,7 @@ def addtwo(a, b): It can be stored as a RedisAI script using the CPU device with [`redis-cli`](https://redis.io/topics/rediscli) as follows: ``` -$ cat addtwo.py | redis-cli -x AI.SCRIPTSET myscript addtwo CPU TAG SOURCE myscript:v0.1 +$ cat addtwo.py | redis-cli -x AI.SCRIPTSET myscript addtwo CPU TAG myscript:v0.1 SOURCE OK ``` @@ -514,9 +514,9 @@ The **`AI.DAGRUN`** command specifies a direct acyclic graph of operations to ru It accepts one or more operations, split by the pipe-forward operator (`|>`). -By default, the DAG execution context is local, meaning that loading and persisting tensors should be done explicitly. The user should specify which key tensors to load from keyspace using the `LOAD` keyword, and which command outputs to persist to the keyspace using the `PERSIST` keyspace. +By default, the DAG execution context is local, meaning that tensor keys appearing in the DAG only live in the scope of the command. 
That is, setting a tensor with `TENSORSET` will store it in local memory and not set it to an actual database key. One can refer to that key in subsequent commands within the DAG, but that key won't be visible outside the DAG or to other clients - no keys are open at the database level. -When `PERSIST` is not present, object savings are done locally and kept only during the context of the DAG meaning that no output keys are open. +Loading and persisting tensors from/to the keyspace should be done explicitly. The user should specify which key tensors to load from the keyspace using the `LOAD` keyword, and which command outputs to persist to the keyspace using the `PERSIST` keyword. As an example, if `command 1` sets a tensor, it can be referenced by any further command in the chain. @@ -611,21 +611,21 @@ The following example obtains the previously-run 'myscript' script's runtime sta ``` redis> AI.INFO myscript - 1) KEY + 1) key 2) "myscript" - 3) TYPE + 3) type 4) SCRIPT - 5) BACKEND + 5) backend 6) TORCH - 7) DEVICE + 7) device 8) CPU - 9) DURATION + 9) duration 10) (integer) 11391 -11) SAMPLES +11) samples 12) (integer) -1 -13) CALLS +13) calls 14) (integer) 1 -15) ERRORS +15) errors 16) (integer) 0 ``` diff --git a/docs/configuration.md b/docs/configuration.md index 441c8fc3a..5043f1b38 100644 --- a/docs/configuration.md +++ b/docs/configuration.md @@ -151,6 +151,8 @@ redis-server --loadmodule /usr/lib/redis/modules/redisai.so \ ### THREADS_PER_QUEUE The **THREADS_PER_QUEUE** configuration option controls the number of worker threads allocated to each device's job queue. Multiple threads can be used for executing different independent operations in parallel. +Note that RedisAI maintains one job queue per device (CPU, GPU:0, GPU:1). Each job queue is consumed by THREADS_PER_QUEUE threads.
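As a hedged sketch of how this option is set at module load time (the module path and the value of 4 are illustrative assumptions that depend on your installation):

```shell
redis-server --loadmodule /usr/lib/redis/modules/redisai.so THREADS_PER_QUEUE 4
```

With one CPU queue and one GPU queue, this setting would yield eight worker threads in total.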
+ This option can significantly improve the performance of simple, computationally inexpensive models, since modern CPUs and hardware accelerators (GPUs, TPUs, ...) have spare computation cycles available. _Expected Value_ diff --git a/docs/developer.md b/docs/developer.md index 087439b09..96f345a3b 100644 --- a/docs/developer.md +++ b/docs/developer.md @@ -6,7 +6,6 @@ The following sections discuss topics relevant to the development of the RedisAI RedisAI bundles together best-of-breed technologies for delivering stable and fast model serving. To do so, we need to abstract from what each specific DL/ML framework offers and provide common data structures and APIs to the DL/ML domain. - As a way of representing tensor data we've embraced [dlpack](https://github.com/dmlc/dlpack) - a community effort to define a common tensor data structure that can be shared by different frameworks, supported by cuPy, cuDF, DGL, TGL, PyTorch, and MxNet. **Data Structures** @@ -14,8 +13,8 @@ As a way of representing tensor data we've embraced [dlpack](https://github.com/ RedisAI provides the following data structures: * **Tensor**: represents an n-dimensional array of values -* **Model**: represents a frozen graph by one of the supported DL/ML framework backends -* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) +* **Model**: represents a computation graph by one of the supported DL/ML framework backends +* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) program ## Source code layout @@ -33,7 +32,6 @@ of complexity incrementally. **redisai.c** - This is the entry point of the RedisAI module, responsible for registering the new commands in the Redis server, and containing all command functions to be called. This file is also responsible for exporting the Tensor, Script and Model APIs to other modules.
**tensor.h** diff --git a/docs/images/graph.pb.png b/docs/images/graph.pb.png new file mode 100644 index 0000000000000000000000000000000000000000..6b46f676d50394c76d5187ef721383ea994203a1 GIT binary patch literal 8080 [binary image data for docs/images/graph.pb.png omitted] literal 0 HcmV?d00001 diff --git a/docs/index.md b/docs/index.md index 2999c9ffc..8a9c9c47f 100644 --- a/docs/index.md +++ b/docs/index.md @@ -4,15 +4,16 @@ RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data.
Its purpose is to be a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. RedisAI both simplifies the deployment and serving of graphs, by leveraging Redis' production-proven infrastructure, and maximizes computation throughput by adhering to the principle of data locality. +RedisAI is a joint effort between [Redis Labs](https://www.redislabs.com) and [Tensorwerk](https://tensorwerk.com). + ## Where Next? * The [Introduction](intro.md) is the recommended starting point * The [Quickstart](quickstart.md) page provides information about building, installing and running RedisAI - * The [Commands](commands.md) page is a reference of RedisAI's API + * The [Commands](commands.md) page is a reference of the RedisAI API * The [Clients](clients.md) page lists RedisAI clients by programming language * The [Configuration](configuration.md) page explains how to configure RedisAI * The [Developer](developer.md) page has more information about the design and implementation of the RedisAI module - ## Quick Links * [Source code repository](https://github.com/RedisAI/RedisAI) * [Releases](https://github.com/RedisAI/RedisAI/releases) diff --git a/docs/intro.md b/docs/intro.md index 5b3302d54..0fd3c8094 100644 --- a/docs/intro.md +++ b/docs/intro.md @@ -31,10 +31,10 @@ In broad strokes, RedisAI looks as follows: | | | myscript +----->+ Script +---+ +----+-----+ +-->+ ONNXRuntime | | | | | | | | | +--------+ | ^ | +-------------+ | | | | | +----------+ | | | | | | | -| | | | | +--------+ | | | +-------------+ | | | -| | | mydag +----->+ DAG +---+ | +-->+ ... | | | | -| | | | | +--------+ | +-------------+ | | | -| | +----------+ ^ +------------------------|-----------------------------+ | | +| | ^ | +--------+ | | | +-------------+ | | | +| | | | + DAG +---+ | +-->+ ...
| | | | +| | | | +--------+ | +-------------+ | | | +| | | +------------------------|-----------------------------+ | | | +--------------|--------------------------|-------------------------------+ | | v v | | +--------------+-----------------+ +------------------------------------+ | @@ -83,22 +83,22 @@ A **Redis module** is a shared library that can be loaded by the Redis server du * [Modules published at redis.io](https://redis.io/modules) ### Why RedisAI? -RedisAI bundles together best-of-breed technologies for delivering stable and performant graph serving. Every DL/ML framework ships with a backend for executing the graphs developed by it, and the common practice for serving these is building a simple server. +RedisAI bundles together best-of-breed technologies for delivering stable and performant computation graph serving. Every DL/ML framework ships with a runtime for executing the models developed with it, and the common practice for serving these is building a simple server around them. RedisAI aims to be that server, saving you the need to install the backend you're using and to develop a server for it. By itself that does not justify RedisAI's existence, so there's more to it. Because RedisAI is implemented as a Redis module it automatically benefits from the server's capabilities: be it Redis' native data types, its robust ecosystem of clients, high-availability, persistence, clustering, and Enterprise support. -Because Redis is an in-memory data structure server RedisAI uses it for storing all of its data. The main data type supported by RedisAI is the Tensor that is the standard representation of data in the DL/ML domain. Because tensors are stored memory space of the Redis server they are readily accessible to any of RedisAI's backend libraries at minimal latency. +Because Redis is an in-memory data structure server RedisAI uses it for storing all of its data.
The main data type supported by RedisAI is the Tensor, which is the standard representation of data in the DL/ML domain. Because tensors are stored in the memory space of the Redis server, they are readily accessible to any of RedisAI's backend libraries at minimal latency. The locality of data, that is, tensor data residing adjacent to the DL/ML model backends, allows RedisAI to provide optimal performance when serving models. It also makes it a perfect choice for deploying DL/ML models in production and allowing them to be used by any application. -Furthermore, RedisAI is also an optimal testbed for models as it allows the parallel execution of multiple graphs and, in future versions, assessing their respective performance in real-time. +Furthermore, RedisAI is also an optimal testbed for models as it allows the parallel execution of multiple computation graphs and, in future versions, assessing their respective performance in real-time. #### Data Structures RedisAI provides the following data structures: * **Tensor**: represents an n-dimensional array of values -* **Model**: represents a frozen graph by one of the supported DL/ML framework backends -* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) +* **Model**: represents a computation graph by one of the supported DL/ML framework backends +* **Script**: represents a [TorchScript](https://pytorch.org/docs/stable/jit.html) program #### DL/ML Backends RedisAI supports the following DL/ML identifiers and respective backend libraries: @@ -135,7 +135,7 @@ docker exec -it redisai redis-cli ``` ## Using RedisAI Tensors -A **Tensor** is an n-dimensional array and is the standard vehicle for DL/ML data. RedisAI adds to Redis a Tensor data structure that implements the tensor type. Like any datum in Redis, RedisAI's Tensors are identified by key names. +A **Tensor** is an n-dimensional array and is the standard representation for data in DL/ML workloads.
RedisAI adds to Redis a Tensor data structure that implements the tensor type. Like any datum in Redis, RedisAI's Tensors are identified by key names. Creating new RedisAI tensors is done with the [`AI.TENSORSET` command](commands.md#aitensorset). For example, consider the tensor: $\begin{equation*} tA = \begin{bmatrix} 2 \\ 3 \end{bmatrix} \end{equation*}$. @@ -154,7 +154,7 @@ Copy the command to your cli and hit the `` on your keyboard to execute i OK ``` -The reply 'OK' means that the operation was successful. We've called the `AI.TENSORSET` command to set the key named 'tA' with the tensor's data, but the name could have been any string value. The `FLOAT` argument specifies the type of values that the tensor stores, and in this case a double-precision floating-point. After the type argument comes the tensor's shape as a list of its dimensions, or just a single dimension of 2. +The reply 'OK' means that the operation was successful. We've called the `AI.TENSORSET` command to set the key named 'tA' with the tensor's data, but the name could have been any string value. The `FLOAT` argument specifies the type of values that the tensor stores, and in this case a single-precision floating-point. After the type argument comes the tensor's shape as a list of its dimensions, or just a single dimension of 2. The `VALUES` argument tells RedisAI that the tensor's data will be given as a sequence of numeric values and in this case the numbers 2 and 3. This is useful for development purposes and creating small tensors; however, for practical purposes the `AI.TENSORSET` command also supports importing data in binary format. @@ -188,7 +188,7 @@ A **Model** is a Deep Learning or Machine Learning frozen graph that was generat Models, like any other Redis and RedisAI data structures, are identified by keys. A Model's key is created using the [`AI.MODELSET` command](commands.md#aimodelset) and requires the graph payload serialized as protobuf for input.
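The binary format mentioned above is just the tensor's raw values laid out contiguously. As a minimal sketch, assuming a `FLOAT` tensor carries little-endian single-precision values (an assumption to verify against your RedisAI version), the blob for the example tensor can be built in Python:

```python
import struct

# tA = [2, 3] packed as raw float32 bytes - the payload that a BLOB
# argument would carry instead of the textual VALUES form.
# (Layout assumed: little-endian, row-major, 4 bytes per value.)
values = [2.0, 3.0]
blob = struct.pack("<2f", *values)

print(len(blob))                   # 8 (2 values x 4 bytes each)
print(struct.unpack("<2f", blob))  # (2.0, 3.0)
```

A binary-safe client would then send something along the lines of `AI.TENSORSET tA FLOAT 2 BLOB <blob>`.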
-In our examples, we'll use one of the graphs that RedisAI uses in its tests, namely 'graph.pb', which can be downloaded from [here](https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb). +In our examples, we'll use one of the graphs that RedisAI uses in its tests, namely 'graph.pb', which can be downloaded from [here](https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb). This graph was created using TensorFlow with [this script](https://github.com/RedisAI/RedisAI/blob/master/test/test_data/tf-minimal.py). ??? info "Downloading 'graph.pb'" Use a web browser or the command line to download 'graph.pb': @@ -197,6 +197,12 @@ In our examples, we'll use one of the graphs that RedisAI uses in its tests, nam wget https://github.com/RedisAI/RedisAI/raw/master/test/test_data/graph.pb ``` +You can view the computation graph using [Netron](https://lutzroeder.github.io/netron/), which supports all frameworks supported by RedisAI. + +![Computation graph visualized in Netron](images/graph.pb.png "Computation Graph Visualized in Netron") + +This is a great way to inspect a graph and find out node names for inputs and outputs. + redis-cli doesn't provide a way to read files' contents, so to load the model with it we'll use the command line and output pipes: ```