
Ollama v0.1.34 Timeout issue on Codellama34B #4283

Open
humza-sami opened this issue May 9, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@humza-sami
What is the issue?

I am trying to run the Codellama34B model on Ollama v0.1.34 and it keeps giving me a timeout error, even though I was able to run Codellama70B on the same version. I then rolled Ollama back to v0.1.32 and Codellama34B worked. It seems the latest version does not support Codellama34B.
[screenshot: timeout error when running Codellama34B]
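For reference, a minimal reproduction sketch against the standard Ollama REST API. The original report does not include the exact command, prompt, or client timeout, so the values below are assumptions:

```python
# Sketch of reproducing the report: request a completion from codellama:34b
# via the Ollama REST API and watch for a timeout.
# The prompt, timeout value, and host are assumptions, not from the report.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

payload = {
    "model": "codellama:34b",
    "prompt": "Write a function that reverses a string.",
    "stream": False,
}

try:
    # On v0.1.34 this reportedly times out while the model loads;
    # on v0.1.32 the same request returns normally.
    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    print(response.json()["response"])
except requests.exceptions.Timeout:
    print("Request timed out waiting for codellama:34b to respond.")
```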

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.1.34

@humza-sami added the bug (Something isn't working) label on May 9, 2024