
Conversation

@NiklasGustafsson
Contributor

See #1180

@shaltielshmid
Contributor

In PyTorch there is an additional restriction: when it does accept a floating point dtype, it only applies the conversion to parameters and buffers that are already floating point, see here.
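A minimal PyTorch sketch of that behavior (assuming a `BatchNorm1d` layer, whose `num_batches_tracked` buffer is an integer):

```python
import torch
import torch.nn as nn

# Module.to(dtype) only casts floating-point parameters and buffers;
# integer buffers such as BatchNorm's num_batches_tracked are left untouched.
model = nn.BatchNorm1d(4)

print(model.weight.dtype)               # torch.float32
print(model.num_batches_tracked.dtype)  # torch.int64

model.to(torch.float64)

print(model.weight.dtype)               # torch.float64 (floating point, so it was cast)
print(model.num_batches_tracked.dtype)  # torch.int64   (integer buffer, left as-is)

# Passing a non-floating-point dtype is rejected outright, e.g.:
# model.to(torch.int32)  # -> TypeError: nn.Module.to only accepts floating point or complex dtypes ...
```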

@NiklasGustafsson
Contributor Author

> In PyTorch there is an additional restriction: when it does accept a floating point dtype, it only applies the conversion to parameters and buffers that are already floating point, see here.

Thanks for pointing it out. It's not exactly what the expression says, but it was something I missed.

@NiklasGustafsson NiklasGustafsson merged commit 089511d into dotnet:main Dec 12, 2023

Development

Successfully merging this pull request may close these issues.

Module.to(ScalarType) has restrictions in PyTorch which aren't restricted in TorchSharp
