
Conversation

@harborn (Contributor) commented on Feb 5, 2024

No description provided.

gpt_base_model: true
output_dir: /tmp/llm-ray/output
checkpoint_dir: /tmp/llm-ray/checkpoint
tracking_dir: /tmp/llm-ray/tracking
A contributor commented on the config lines above:

There are 9 files in the models directory. Do bloom-560m.yaml, finetune_config_template.yaml, gpt2.yaml, llama-7b.yaml and opt-125m.yaml also need to be modified?
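For illustration, a sketch of what the analogous change could look like in one of the other model configs, e.g. gpt2.yaml. The file name and placement of these keys are assumptions and not part of this PR; only the three path values come from the snippet quoted above:

# Hypothetical excerpt from gpt2.yaml (illustrative only).
# Only the three directory values are taken from the quoted snippet;
# everything else about the file is assumed.
output_dir: /tmp/llm-ray/output
checkpoint_dir: /tmp/llm-ray/checkpoint
tracking_dir: /tmp/llm-ray/tracking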

@jiafuzha (Contributor) commented:

Please ignore the mpt-7b-bigdl test failure; it's a bigdl issue. I've reported it in intel/ipex-llm#10177.

@carsonwang (Contributor) left a comment:

LGTM

@harborn (Contributor, Author) commented on Feb 23, 2024

no comments

@harborn merged commit 535de7d into intel:main on Feb 23, 2024.