recipes_source/recipes/tuning_guide.py (+20 −2: 20 additions, 2 deletions)
@@ -240,7 +240,7 @@ def fused_gelu(x):
 # Use oneDNN Graph with TorchScript for inference
 # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 # oneDNN Graph can significantly boost inference performance. It fuses some compute-intensive operations such as convolution and matmul with their neighboring operations.
-# Currently, it's supported as an experimental feature for Float32 data-type.
+# In PyTorch 2.0, it is supported as a beta feature for Float32 & BFloat16 data-types.
 # oneDNN Graph receives the model’s graph and identifies candidates for operator fusion with respect to the shape of the example input.
 # A model should be JIT-traced using an example input.
 # Speed-up would then be observed after a couple of warm-up iterations for inputs with the same shape as the example input.
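For context, a minimal sketch of the workflow these comment lines describe (Float32 path). The toy ConvNet is only a stand-in for illustration; any traceable nn.Module works the same way. torch.jit.enable_onednn_fusion, torch.jit.trace, and torch.jit.freeze are the PyTorch calls involved.

import torch
import torch.nn as nn

# Turn on oneDNN Graph fusion for TorchScript inference
torch.jit.enable_onednn_fusion(True)

# Stand-in model for illustration; substitute your own eval-mode module
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1),
    nn.ReLU(),
).eval()

example_input = torch.rand(32, 3, 224, 224)

with torch.no_grad():
    # JIT-trace with an example input, then freeze to fold constants
    traced = torch.jit.trace(model, example_input)
    traced = torch.jit.freeze(traced)

    # A couple of warm-up iterations let the fusion take effect for this input shape
    for _ in range(2):
        traced(example_input)

    # Subsequent calls with the same input shape should show the speed-up
    output = traced(example_input)

Note that the fusion is specialized to the example input's shape, which is why the warm-up iterations matter and why inference inputs should match that shape.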