
Conversation

@sfc-gh-zhwang sfc-gh-zhwang commented Oct 8, 2023

When the prompt is very long, the kernel's shared memory usage grows beyond 48 KB, which is the default per-block limit in CUDA. We need to set the shared memory size manually.

https://leimao.github.io/blog/CUDA-Shared-Memory-Capacity/#:~:text=However%2C%20CUDA%20shared%20memory%20has,is%2048%20KB%20by%20default.
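For illustration, here is a minimal sketch of the fix described above: opting in to more than the 48 KB default of dynamic shared memory with `cudaFuncSetAttribute` before launching the kernel. The kernel name (`long_prompt_kernel`), the 64 KB size, and the launch configuration are hypothetical stand-ins, not the actual kernel touched by this PR.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical kernel that stages data in dynamic shared memory; a stand-in
// for the attention kernel whose buffer grows with prompt length.
__global__ void long_prompt_kernel(const float* in, float* out, int n) {
    extern __shared__ float smem[];
    int i = threadIdx.x;
    if (i < n) {
        smem[i] = in[i];
        out[i] = smem[i] * 2.0f;
    }
}

int main() {
    // With a long prompt the per-block buffer can exceed the 48 KB default
    // cap, so raise the limit explicitly before the launch; without this,
    // the launch fails (or silently produces garbage if the error is
    // never checked).
    size_t smem_size = 64 * 1024;  // 64 KB > 48 KB default
    if (smem_size >= 48 * 1024) {
        cudaFuncSetAttribute(long_prompt_kernel,
                             cudaFuncAttributeMaxDynamicSharedMemorySize,
                             smem_size);
    }
    long_prompt_kernel<<<1, 1024, smem_size>>>(nullptr, nullptr, 0);
    printf("launch status: %s\n", cudaGetErrorString(cudaGetLastError()));
    return 0;
}
```

Note that `cudaFuncSetAttribute` only needs to be called once per kernel, and the opt-in is capped by the device's `cudaDevAttrMaxSharedMemoryPerBlockOptin` (e.g. up to 163 KB on A100).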

cc @sfc-gh-vichan @neevaco/corvo

@sfc-gh-zhwang sfc-gh-zhwang changed the title Zhwang/fix code llama long prompt TIL: fix trash output for code llama for long prompt Oct 8, 2023
