aten.gelu + aten.tanh
-
Function Schema: torch.ops.aten.gelu.default: ((torch.float32,), {}), torch.ops.aten.tanh.default: ((torch.float32,), {})
-
Original PyTorch API: torch.nn.functional.gelu, torch.tanh
-
Relevant TensorRT Documentation: IActivationLayer
Add support for gelu and tanh as aten converters.
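TensorRT's IActivationLayer provides tanh natively (ActivationType::kTANH), while gelu has no direct activation type, so a converter may decompose it into elementwise ops plus a tanh activation. As a reference for what such a decomposition computes, here is a minimal sketch of the standard tanh approximation of GELU (this matches PyTorch's `approximate="tanh"` mode; the constant 0.044715 is from the original GELU paper, not anything specific to this issue):

```python
import math

def gelu_tanh(x: float) -> float:
    # Tanh approximation of GELU:
    #   0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    # A TensorRT converter would build the same expression from
    # elementwise layers feeding an IActivationLayer(kTANH).
    inner = math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)
    return 0.5 * x * (1.0 + math.tanh(inner))

# Sanity checks against known GELU behavior:
# gelu(0) = 0, and gelu(x) -> x for large positive x.
print(gelu_tanh(0.0))
print(gelu_tanh(10.0))
```

A converter validated against `torch.nn.functional.gelu(..., approximate="tanh")` should agree with this formula to floating-point tolerance.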