
Conversation

Contributor

@ubik2 ubik2 commented May 2, 2023

When loading a safetensors file, ignore the metadata header.
If no pt or pth files are available, attempt to load safetensors files.
Edit: This has been changed to try to load safetensors files first, and only load the pt/pth/bin file if those aren't available.
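The metadata header the first point refers to is part of the safetensors file layout: an 8-byte little-endian length, then a JSON header mapping tensor names to their dtype/shape/offsets, plus an optional `__metadata__` entry of free-form strings that is not a tensor. A minimal sketch of reading it (`read_safetensors_header` is a hypothetical helper for illustration, not convert.py's actual loader):

```python
import json
import struct

def read_safetensors_header(filename):
    """Return the tensor entries of a safetensors file, skipping __metadata__."""
    with open(filename, "rb") as f:
        # The file starts with an 8-byte little-endian header length.
        (header_size,) = struct.unpack("<Q", f.read(8))
        # The header itself is JSON of exactly that many bytes.
        header = json.loads(f.read(header_size))
    # "__metadata__" holds free-form key/value strings, not a tensor,
    # so drop it before iterating over tensor entries.
    header.pop("__metadata__", None)
    return header
```

Tensor data follows the header, at the byte ranges given by each entry's `data_offsets`.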

… or pth files are available, attempt to load safetensors files
convert.py (Outdated)

    files = [file for glob in globs for file in path.glob(glob)]
    if not files:
        # Check if it's a set of safetensors files
        globs = ["model-00001-of-*.safetensors"]
Contributor

Why not just add "model-00001-of-*.safetensors" to the globs above?

Contributor Author

@ubik2 ubik2 May 4, 2023

That would generally work as well. I was thinking of treating it the same as the ggml files, where it has lower priority than the pth files in case both exist. If I simply add the safetensors pattern to the globs above and both formats are present, the convert script will abort, since it can't tell which one to use.
In any case, I'm happy to change it if you prefer.
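The priority scheme under discussion can be sketched as follows (`find_model_files` is a hypothetical illustration of the fallback order, not the PR's actual code; sharded checkpoints such as model-00001-of-00002.safetensors also match `*.safetensors`):

```python
from pathlib import Path

def find_model_files(path: Path) -> list[Path]:
    """Return checkpoint files, trying one format's globs before the other's."""
    # Each inner list is one priority tier; the first tier that matches wins,
    # so mixing formats in one directory no longer produces an ambiguous set.
    for globs in (["*.safetensors"], ["*.pt", "*.pth", "*.bin"]):
        files = [file for glob in globs for file in path.glob(glob)]
        if files:
            return files
    raise FileNotFoundError(f"no model files found in {path}")
```

With the tiers ordered as shown, safetensors files are preferred and pt/pth/bin are used only as a fallback, matching the updated behavior of the PR.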

Contributor

@prusnak prusnak May 4, 2023

Hm, shouldn't the safetensors be preferred if anything?

Contributor Author

That sounds good to me.
I initially wanted to minimize the behavior change, but I've updated the PR to try with the safetensors first.
