### Prerequisites

- [X] I am running the latest code. Mention the version if possible as well.
- [X] I carefully followed the [README.md](https://github.com/ggerganov/llama.cpp/blob/master/README.md).
- [X] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [X] I reviewed the [Discussions](https://github.com/ggerganov/llama.cpp/discussions), and have a new and useful enhancement to share.

### Feature Description

Please add support for InternLM 2.5 1M (https://huggingface.co/internlm/internlm2_5-7b-chat-1m). The model currently loads and runs, but its responses are of very poor quality, which suggests it is not yet handled correctly.

### Motivation

Proper support would let the model produce correct output instead of the terrible responses it currently gives.

### Possible Implementation

_No response_