When calling Module.to(ScalarType) on a module, PyTorch restricts the target dtype to a floating-point or complex type. See here.
Also, when given a floating-point dtype, PyTorch converts only those parameters that are already floating point; non-floating-point tensors keep their existing dtype. See for example here.
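For reference, a minimal Python sketch of the PyTorch behavior described above (exact error message may vary by version):

```python
import torch
import torch.nn as nn

class Demo(nn.Module):
    def __init__(self):
        super().__init__()
        # Floating-point parameter: converted by .to(dtype).
        self.weight = nn.Parameter(torch.zeros(3, dtype=torch.float32))
        # Integer buffer: left untouched by .to(dtype).
        self.register_buffer("step", torch.zeros(1, dtype=torch.int64))

m = Demo()
m.to(torch.float64)
print(m.weight.dtype)  # torch.float64 -- floating-point tensors are converted
print(m.step.dtype)    # torch.int64   -- non-floating-point tensors keep their dtype

try:
    m.to(torch.int32)  # rejected: target dtype must be floating point or complex
except TypeError as e:
    print(e)
```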
This would be relatively trivial to add; would TorchSharp want to adopt the same restrictions?