
Misc. bug: ValueError: Duplicated key name 'deepseek2.attention.key_length' #14093

@ChuanhongLi

Name and Version

We need an fp16 DeepSeek-R1 model, so I converted the fp8 DeepSeek-R1 model to bf16 using the fp8_cast_bf16.py script (from the "inference" folder of deepseek-ai/DeepSeek-V3).

When using the convert_hf_to_gguf.py script to convert the bf16 model to fp16, I hit the following error:

Traceback (most recent call last):
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/convert_hf_to_gguf.py", line 6102, in <module>
    main()
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/convert_hf_to_gguf.py", line 6096, in main
    model_instance.write()
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/convert_hf_to_gguf.py", line 402, in write
    self.prepare_metadata(vocab_only=False)
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/convert_hf_to_gguf.py", line 486, in prepare_metadata
    super().prepare_metadata(vocab_only=vocab_only)
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/convert_hf_to_gguf.py", line 392, in prepare_metadata
    self.set_gguf_parameters()
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/convert_hf_to_gguf.py", line 4834, in set_gguf_parameters
    self.gguf_writer.add_key_length(hparams["kv_lora_rank"] + hparams["qk_rope_head_dim"])
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/gguf-py/gguf/gguf_writer.py", line 687, in add_key_length
    self.add_uint32(Keys.Attention.KEY_LENGTH.format(arch=self.arch), length)
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/gguf-py/gguf/gguf_writer.py", line 290, in add_uint32
    self.add_key_value(key, val, GGUFValueType.UINT32)
  File "/mnt/host_root/home/jovyan/lich/llama.cpp/gguf-py/gguf/gguf_writer.py", line 273, in add_key_value
    raise ValueError(f'Duplicated key name {key!r}')
ValueError: Duplicated key name 'deepseek2.attention.key_length'

Any ideas about what causes this? Can I just skip this check?

Thanks!
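For context, the check that fires here is a simple guard against writing the same GGUF metadata key twice. A minimal sketch of that mechanism (a simplified stand-in, not the actual gguf-py GGUFWriter API):

```python
# Hypothetical simplified writer illustrating the duplicate-key guard
# seen in gguf-py/gguf/gguf_writer.py's add_key_value.

class MiniWriter:
    def __init__(self):
        self.kv_data = {}

    def add_key_value(self, key, val):
        # The real writer raises this same error when a key such as
        # 'deepseek2.attention.key_length' is added a second time.
        if key in self.kv_data:
            raise ValueError(f"Duplicated key name {key!r}")
        self.kv_data[key] = val

w = MiniWriter()
w.add_key_value("deepseek2.attention.key_length", 576)
try:
    w.add_key_value("deepseek2.attention.key_length", 576)
except ValueError as e:
    print(e)  # Duplicated key name 'deepseek2.attention.key_length'
```

So skipping the check would just mask the real problem: set_gguf_parameters is writing attention.key_length twice for the deepseek2 architecture, and the second write is the one that needs to go away (or be deduplicated), not the guard.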

Operating systems

No response

Which llama.cpp modules do you know to be affected?

No response

Command line

python convert_hf_to_gguf.py /mnt/host_root/home/lich/gguf/DeepSeek-R1-bf16 --outfile /mnt/host_root/home/models/DeepSeek-R1-fp16.gguf --outtype f16

Problem description & steps to reproduce

python fp8_cast_bf16.py --input-fp8-hf-path /home/admin/models/deepseek-ai/DeepSeek-R1/ --output-bf16-hf-path /mnt/host_root/home/lich/gguf/DeepSeek-R1-bf16

python convert_hf_to_gguf.py /mnt/host_root/home/lich/gguf/DeepSeek-R1-bf16 --outfile /mnt/host_root/home/models/DeepSeek-R1-fp16.gguf --outtype f16
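Per the traceback, the value being written on the duplicated call is computed from the model hparams as kv_lora_rank + qk_rope_head_dim. A small sketch of that computation (the hparams values below are illustrative; the real ones come from the converted model's config.json):

```python
# Illustrative hparams; real values are read from config.json by
# convert_hf_to_gguf.py. These two are assumptions, not verified values.
hparams = {"kv_lora_rank": 512, "qk_rope_head_dim": 64}

# Matches the expression in set_gguf_parameters for the deepseek2
# architecture, as shown in the traceback.
key_length = hparams["kv_lora_rank"] + hparams["qk_rope_head_dim"]
print(key_length)  # 576 with these illustrative values
```

Checking whether these fields are present (and sane) in the bf16 model's config.json after fp8_cast_bf16.py may help rule out a conversion-side metadata problem before blaming the GGUF writer.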

First Bad Commit

No response

Relevant log output
