[URGENT] llama.cpp / GGUF breaks on Colab #476

Open
danielhanchen opened this issue May 16, 2024 · 12 comments
Labels
fixed - pending confirmation (Fixed, waiting for confirmation from poster) · URGENT BUG (Urgent bug)

Comments

@danielhanchen
Contributor

llama.cpp seems to be unable to install - investigating now.

(screenshot of the install error)

@eugeniosegala

Facing the same issue.

Is Unsloth always pulling the latest version of llama.cpp? If so, would it not be safer to lock it to a known version? Happy to help with that.
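For example, something along these lines could pin the build to a fixed commit instead of tracking master - a rough sketch only, assuming Unsloth clones llama.cpp at install time, with <KNOWN_GOOD_COMMIT> as a placeholder rather than a verified revision:

!git clone https://github.com/ggerganov/llama.cpp
%cd llama.cpp
!git checkout <KNOWN_GOOD_COMMIT>   # pin to a tested commit or release tag
!make -j                            # build as usual
%cd ..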

@danielhanchen
Contributor Author

danielhanchen commented May 16, 2024

Sorry about the issue! Hmm, I'm actually unsure what the cause is yet - some installs work whilst others don't, and I'm not sure what is making them fail.

@danielhanchen
Contributor Author

@eugeniosegala Found the issue - reporting to HuggingFace. It seems like PEFT got an update, weirdly causing stuff to break in Colab.

Please change the installation instructions to

%%capture
# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip install --no-deps "xformers<0.0.26" trl "peft<0.11.0" accelerate bitsandbytes

in the first cell

@kirilgorbachov

Did not resolve the issue.
I am getting:
(screenshot of the error)

I tried both:
"peft<0.11.0"

and a manual reinstall to
"peft==0.10.0"

Neither helped.
P.S. Yesterday everything was working fine - probably some dependency changed.

@kirilgorbachov

When run for the first time, it tries to re-install llama.cpp:
(screenshot)

@gaurav-nelson

+1 Didn't resolve the issue. I still get the error TypeError: a bytes-like object is required, not 'str' in the Colab notebook while running save_pretrained_gguf

@Jiar

Jiar commented May 17, 2024

> +1 Didn't resolve the issue. I still get the error TypeError: a bytes-like object is required, not 'str' in the Colab notebook while running save_pretrained_gguf

Same here.
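For anyone debugging this: the message is Python's generic complaint when bytes and str get mixed. A minimal sketch of the pattern (my illustration, not Unsloth's actual code):

import subprocess

# check_output returns bytes by default, e.g. b"hello\n"
out = subprocess.check_output(["echo", "hello"])
# out.split("\n")  # raises: TypeError: a bytes-like object is required, not 'str'
print(out.split(b"\n"))                  # OK: split bytes with a bytes separator
print(out.decode("utf-8").split("\n"))   # OK: decode to str first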

@danielhanchen
Contributor Author

@kirilgorbachov @Jiar @gaurav-nelson So so sorry - it was my dumb mistake!! I accidentally made a typo!!

If you're still on Colab, please uninstall unsloth or edit the top cell to

# %%capture
# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip uninstall unsloth -y
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip install --no-deps "xformers<0.0.26" trl "peft<0.11.0" accelerate bitsandbytes

then restart and run all

Sorry again!
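As a quick sanity check after the restart (a suggestion, not part of the original instructions), you can confirm the pin took effect in a fresh cell:

import peft
print(peft.__version__)   # expect 0.10.x, i.e. below the pinned 0.11.0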

@eugeniosegala

@danielhanchen, it's working now! Thanks 🙏

@gaurav-nelson

Thank you @danielhanchen 🥇 It is working now.

@kirilgorbachov

@danielhanchen I do not actually see your spelling mistake, but it is definitely fixed now and functions properly.

I tried to debug it a bit yesterday; it seems unsloth installs llama.cpp via git clone ... master of their repo. I actually believe they had some issue for a while, and now it's resolved. Maybe versioning / release tagging would be a good idea.

@danielhanchen Thanks for your time anyway!

@danielhanchen
Contributor Author

As an update, the super speedy HF team have isolated the issue and have swiftly found a fix :) Hats off to them for their fabulous speed! huggingface/peft#1739

danielhanchen unpinned this issue May 29, 2024