Commit 6930c3c

feat: airbox q900 ai dev en docs

1 parent 99aaef8 commit 6930c3c

67 files changed: +4426 −185 lines


i18n/en/docusaurus-plugin-content-docs/current/common/ai/_aimet.mdx

Lines changed: 19 additions & 19 deletions
@@ -1,13 +1,13 @@
  [AIMET](https://github.com/quic/aimet) (AI Model Efficiency Toolkit) is a quantization tool for deep learning models such as PyTorch and ONNX. AIMET enhances the performance of deep learning models by reducing computational load and memory usage.

- With AIMET, developers can quickly iterate to find the optimal quantization configuration, achieving the best balance between accuracy and latency. Developers can compile and deploy quantized models exported from AIMET on Qualcomm NPUs using [QAIRT](./qairt-usage), or run them directly with ONNX-Runtime.
+ With AIMET, developers can quickly iterate to find the quantization configuration that best balances accuracy and latency. The quantized models exported by AIMET can be compiled and deployed on Qualcomm NPUs using [QAIRT](./qairt-usage), or run directly with ONNX Runtime.

  AIMET helps developers with:

  - **Quantization simulation**
  - **Model quantization using Post-Training Quantization (PTQ) techniques**
  - **Quantization-Aware Training (QAT) on PyTorch models using AIMET-Torch**
- - **Visualizing and experimenting with the impact of activation values and weights on model accuracy at different precisions**
+ - **Visualizing and experimenting with how different precisions for activations and weights affect model accuracy**
  - **Creating mixed-precision models**
  - **Exporting quantized models to deployable ONNX format**

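The features above revolve around quantization simulation: mapping float tensors onto a low-precision integer grid and measuring the error this introduces. As a purely illustrative sketch in plain Python (not the AIMET API), the affine int8 round trip that such simulation emulates looks like this:

```python
# Illustrative affine (asymmetric) quantization round trip.
# This is NOT AIMET code; it only shows the basic operation that
# quantization simulation models during calibration.

def quant_params(xs, n_bits=8):
    """Derive scale and zero-point from observed min/max (calibration)."""
    lo, hi = min(min(xs), 0.0), max(max(xs), 0.0)  # grid must include 0
    qmin, qmax = 0, 2 ** n_bits - 1
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = round(qmin - lo / scale)
    return scale, zero_point

def quantize(xs, scale, zp, n_bits=8):
    """Map floats onto the integer grid, clamping to the valid range."""
    qmax = 2 ** n_bits - 1
    return [min(max(round(x / scale + zp), 0), qmax) for x in xs]

def dequantize(qs, scale, zp):
    """Map grid values back to floats; the difference is quantization error."""
    return [(q - zp) * scale for q in qs]

weights = [-1.2, -0.3, 0.0, 0.7, 2.5]
s, zp = quant_params(weights)
q = quantize(weights, s, zp)
deq = dequantize(q, s, zp)
# Per-element error stays within one quantization step (the scale).
err = max(abs(a - b) for a, b in zip(weights, deq))
```

Tools like AIMET automate exactly this loop at model scale, then let you inspect where the error hurts accuracy and adjust the configuration.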
@@ -32,12 +32,12 @@ AIMET requires a Python 3.10 environment, which can be created using [Anaconda](
  :::tip

- - For Anaconda installation, refer to: [**Conda Install**](../virtual-env/conda_install)
+ - For Anaconda installation, refer to: [**Conda Installation**](../virtual-env/conda_install)

- - For creating a conda Python environment, refer to: [**Create Environment with Specific Python Version**](../virtual-env/conda_use#create-environment-with-specific-python-version)
+ - For creating a conda Python environment, refer to: [**Creating a Specific Python Version Environment**](../virtual-env/conda_use#creating-a-specific-python-version-environment)

  :::

- After installing Anaconda, create and activate a Python 3.10 environment using the terminal:
+ After installing Anaconda, use the terminal to create and activate a Python 3.10 environment:

  <NewCodeBlock tip="X86 Linux PC" type="PC">
@@ -62,7 +62,7 @@ AIMET provides two Python packages:
  </NewCodeBlock>

- - **AIMET-Torch**: Perform QAT on PyTorch models
+ - **AIMET-Torch**: Performs QAT on PyTorch models

  <NewCodeBlock tip="X86 Linux PC" type="PC">
@@ -74,7 +74,7 @@ AIMET provides two Python packages:
  - Install jupyter-notebook

- AIMET examples are provided as **jupyter-notebook** references. You need to install the jupyter kernel for the aimet Python environment.
+ AIMET examples are provided as **jupyter-notebooks**, so you need to install the jupyter kernel for the aimet Python environment.

  <NewCodeBlock tip="X86 Linux PC" type="PC">
@@ -87,11 +87,11 @@ AIMET provides two Python packages:
  ## AIMET Usage Example

- This example demonstrates PTQ (Post-Training Quantization) using the PyTorch [ResNet50](https://docs.pytorch.org/vision/main/models/generated/torchvision.models.resnet50.html) object detection model, which is first converted to ONNX format and then quantized using AIMET-ONNX.
- For implementation details, please refer to the ResNet50 example [**notebook**](https://github.com/ZIFENG278/resnet50_qairt_example/blob/main/notebook/quantsim-resnet50.ipynb).
+ This example uses PyTorch's [resnet50](https://docs.pytorch.org/vision/main/models/generated/torchvision.models.resnet50.html) image classification model, converting it to ONNX format and then performing PTQ (Post-Training Quantization) with AIMET-ONNX.
+ For implementation details, please refer to the resnet50 example [**notebook**](https://github.com/ZIFENG278/resnet50_qairt_example/blob/main/notebook/quantsim-resnet50.ipynb).

  :::tip
- The model exported in this example can be used for NPU porting of AIMET quantized models in the [**QAIRT SDK Usage Example**](./qairt-usage#quantizing-models-with-aimet).
+ The model exported in this example can be used in the [**QAIRT SDK Usage Example**](./qairt-usage#using-aimet-for-model-quantization) for porting the AIMET-quantized model to the NPU.
  :::

  ### Prepare the Example Notebook
@@ -129,11 +129,11 @@ wget https://github.com/ZIFENG278/resnet50_qairt_example/raw/refs/heads/main/not
  #### Download the Dataset

- Prepare a calibration dataset. To reduce download time, we'll use [ImageNet-Mini](https://www.kaggle.com/datasets/ifigotin/imagenetmini-1000) as a substitute for the full [ImageNet](https://image-net.org/download.php) dataset.
+ Prepare a calibration dataset. To reduce download time, we'll use [ImageNet-Mini](https://www.kaggle.com/datasets/ifigotin/imagenetmini-1000) as a substitute for [ImageNet](https://image-net.org/download.php).

  - Download the ImageNet-Mini dataset from [Kaggle](https://www.kaggle.com/datasets/ifigotin/imagenetmini-1000)
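Calibration only needs a representative sample of images from the dataset. As a small stdlib-only sketch (not part of the notebook; the function name and the 500-image limit are illustrative), you could gather image paths from the extracted ImageNet-Mini folder like this:

```python
# Collect calibration image paths from an extracted dataset folder.
# Illustrative helper, not from the example notebook.
from pathlib import Path

def list_calibration_images(dataset_dir, limit=500,
                            exts=(".jpeg", ".jpg", ".png")):
    """Return up to `limit` image paths, sorted for reproducibility."""
    paths = sorted(
        p for p in Path(dataset_dir).rglob("*")
        if p.suffix.lower() in exts
    )
    return paths[:limit]
```

For PTQ range calibration, a few hundred representative images are typically sufficient; the full training set is not required.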

- ### Run the Example Notebook
+ ### Execute the Example Notebook

  #### Start jupyter-notebook

@@ -147,14 +147,14 @@ jupyter-notebook
  </NewCodeBlock>

  :::tip
- After starting jupyter-notebook, it will automatically open in your default browser. If it doesn't open automatically, click on the URL printed in the terminal.
+ After starting, jupyter-notebook will automatically open in your default browser. If it doesn't open automatically, you can click the URL printed in the terminal.
  :::

- #### Change the Kernel
+ #### Change Kernel

  On the jupyter-notebook homepage, select `/Examples/onnx/quantization/quantsim-resnet50.ipynb`

- In the notebook's menu bar at the top left, select `Kernel -> Change Kernel -> Select Kernel` and choose the `aimet` kernel created during the [AIMET installation](#aimet-installation).
+ In the notebook's top-left menu bar, select `Kernel -> Change Kernel -> Select Kernel` and choose the `aimet` kernel created during [AIMET installation](#install-aimet).

  <div style={{ textAlign: "center" }}>
  <img
@@ -174,7 +174,7 @@ DATASET_DIR = '<ImageNet-Mini Path>' # Please replace this with a real director
  #### Run the Entire Notebook

- In the notebook's menu bar at the top left, select `Run -> Run All Cells` to execute the entire notebook.
+ In the notebook's top-left menu bar, select `Run -> Run All Cells` to execute the entire notebook.

  <div style={{ textAlign: "center" }}>
  <img src="/en/img/dragon/q6a/run_notebook.webp" style={{ width: "100%" }} />
@@ -185,21 +185,21 @@ The exported `resnet50` model files will be saved in the `aimet_quant` folder, w
  ## Deploying AIMET Models

- AIMET exports models from different frameworks into the specified file formats as shown in the table below:
+ AIMET exports models from different frameworks into specific file formats, as shown in the table below:

  | Framework  | Format     |
  | ---------- | ---------- |
  | PyTorch    | .onnx      |
  | ONNX       | .onnx      |
  | TensorFlow | .h5 or .pb |

- You can use the QAIRT tool to deploy the quantized output files from AIMET to edge devices. For the deployment process, please refer to:
+ The quantized output files from AIMET can be deployed on target devices using QAIRT. For the deployment process, please refer to:

  - [**QAIRT Model Conversion Example**](qairt-usage#model-conversion-example)

  ## Complete Documentation

- For more detailed documentation about AIMET, please refer to
+ For more detailed documentation about AIMET, please refer to:

  - [**AIMET DOCS**](https://quic.github.io/aimet-pages/releases/latest/index.html#)
  - [**AIMET Repository**](https://github.com/quic/aimet)
Lines changed: 152 additions & 0 deletions
@@ -0,0 +1,152 @@
This document describes how to use the [QAI AppBuilder](../qai-appbuilder) Python API to run inference with the [AOT-GAN](https://aihub.qualcomm.com/models/aotgan) image inpainting model on the Qualcomm® Hexagon™ Processor (NPU).

**Supported Devices**

| Device               | SoC     |
| -------------------- | ------- |
| Fogwise® AIRbox Q900 | QCS9075 |

## Install QAI AppBuilder

:::tip

1. Please install QAI AppBuilder according to the [**QAI AppBuilder Installation Guide**](../qai-appbuilder).

2. Configure QAIRT environment variables as described in [**Configure QAIRT Environment Variables**](../qai-appbuilder#configure-qairt-environment-variables).

:::

## Run the Example

### Install Dependencies

<NewCodeBlock tip="Device" type="device">

```bash
pip3 install requests tqdm qai-hub py3_wget opencv-python torch torchvision
```

</NewCodeBlock>

### Run the Script

- Navigate to the example directory

<Tabs>

<TabItem value="QCS9075">

<NewCodeBlock tip="Device" type="device">

```bash
cd ai-engine-direct-helper/samples/python
```

</NewCodeBlock>

</TabItem>

</Tabs>

- Prepare the input images. The following images are used as examples:

{" "}

<div style={{ textAlign: "center" }}>
  <img
    src="/en/img/fogwise/airbox-q900/qairt-aotgan_input.webp"
    style={{ width: "65%" }}
  />
  input image
</div>

<div style={{ textAlign: "center" }}>
  <img
    src="/en/img/fogwise/airbox-q900/qairt-aotgan_mask.webp"
    style={{ width: "65%" }}
  />
  mask image
</div>
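The model consumes both images: the mask marks which pixels of the input should be regenerated. Purely to illustrate that convention (this is not the sample's code; the real pipeline handles the mask internally), a toy sketch of clearing the hole region before the model fills it in:

```python
# Toy illustration of how a binary mask selects the inpainting region.
# NOT the sample's code; real preprocessing works on image tensors.

def apply_mask(image, mask):
    """image: rows of pixel values; mask: rows of 0/1 flags (1 = hole)."""
    return [
        [0 if m else px for px, m in zip(img_row, mask_row)]
        for img_row, mask_row in zip(image, mask)
    ]

image = [[10, 20], [30, 40]]
mask = [[0, 1], [1, 0]]      # 1 marks pixels to regenerate
masked = apply_mask(image, mask)
```

The inpainting model then synthesizes plausible content for the cleared pixels while leaving unmasked pixels untouched.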
- Run inference

<NewCodeBlock tip="Device" type="device">

```bash
python3 aotgan/aotgan.py
```

</NewCodeBlock>

```bash
$ python3 aotgan/aotgan.py
0.0ms [WARNING] <W> Initializing HtpProvider
/prj/qct/webtech_scratch20/mlg_user_admin/qaisw_source_repo/rel/qairt-2.37.1/point_release/SNPE_SRC/avante-tools/prebuilt/dsp/hexagon-sdk-5.5.5/ipc/fastrpc/rpcmem/src/rpcmem_android.c:38:dummy call to rpcmem_init, rpcmem APIs will be used from libxdsprpc
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
136.6ms [WARNING] Time: Read model file to memory. 21.49
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
205.5ms [WARNING] Time: contextCreateFromBinary. 68.83
205.6ms [WARNING] Time: UnmapViewOfFile. 0.00
207.5ms [WARNING] Time: model_initialize aotgan 207.45
415.4ms [WARNING] Time: model_inference aotgan 154.66
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
/usr/bin/xdg-open: 882: www-browser: not found
/usr/bin/xdg-open: 882: links2: not found
/usr/bin/xdg-open: 882: elinks: not found
/usr/bin/xdg-open: 882: links: not found
0.0ms [WARNING] <W> This META does not have Alloc2 Support
0.0ms [WARNING] <W> This META does not have Alloc2 Support
/usr/bin/xdg-open: 882: lynx: not found
/usr/bin/xdg-open: 882: w3m: not found
xdg-open: no method available for opening '/tmp/tmpjvwp3wzi.PNG'
/usr/bin/xdg-open: 882: www-browser: not found
/usr/bin/xdg-open: 882: links2: not found
/usr/bin/xdg-open: 882: elinks: not found
/usr/bin/xdg-open: 882: links: not found
/prj/qct/webtech_scratch20/mlg_user_admin/qaisw_source_repo/rel/qairt-2.37.1/point_release/SNPE_SRC/avante-tools/prebuilt/dsp/hexagon-sdk-5.5.5/ipc/fastrpc/rpcmem/src/rpcmem_android.c:42:dummy call to rpcmem_deinit, rpcmem APIs will be used from libxdsprpc
801.6ms [WARNING] Time: model_destroy aotgan 67.93
/usr/bin/xdg-open: 882: lynx: not found
/usr/bin/xdg-open: 882: w3m: not found
xdg-open: no method available for opening '/tmp/tmp0z61xxdt.PNG'
```
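The `Time:` entries in the log above are per-stage latencies in milliseconds (model load, context creation, initialization, inference, teardown). If you want to pull them out of a captured log to compare runs, a small stdlib sketch (the regex and function are illustrative, not part of the sample):

```python
import re

# Extract "Time: <stage> <ms>" entries from a captured run log.
# Illustrative helper, not part of the aotgan sample.
TIME_RE = re.compile(r"Time: (.+?)\s+([\d.]+)\s*$")

def stage_times(log_text):
    """Map stage description -> latency in ms; last occurrence wins."""
    times = {}
    for line in log_text.splitlines():
        m = TIME_RE.search(line)
        if m:
            times[m.group(1)] = float(m.group(2))
    return times

log = """\
136.6ms [WARNING] Time: Read model file to memory. 21.49
415.4ms [WARNING] Time: model_inference aotgan 154.66
"""
print(stage_times(log))
```

Here `model_inference aotgan 154.66` indicates the NPU inference itself took about 155 ms on this run.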
{" "}

<div style={{ textAlign: "center" }}>
  <img
    src="/en/img/fogwise/airbox-q900/qairt-aotgan_output.webp"
    style={{ width: "65%" }}
  />
  output image
</div>
