
[bug]: Certain FLUX LoRAs install, but fail to run #7134

Closed
1 task done
rikublock opened this issue Oct 17, 2024 · 2 comments · Fixed by #7577
Labels
bug Something isn't working

Comments

@rikublock
Contributor

rikublock commented Oct 17, 2024

Is there an existing issue for this problem?

  • I have searched the existing issues

Operating system

Linux, Ubuntu 22.04

GPU vendor

Nvidia (CUDA)

Version number

v5.1.1

Browser

Chromium 127.0.6533.119

What happened

The LoRA installed successfully via the model manager. When the LoRA was added to an image generation with FLUX (quantized), the generation failed.

What you expected to happen

The image generates successfully.

Additional context

Discord username

riku_block

@rikublock rikublock added the bug Something isn't working label Oct 17, 2024
@freelancer2000

This seems to be the same issue, caused by the quantized version. Have you tried the LoRA with a non-quantized checkpoint?

@RyanJDick
Collaborator

AssertionError in add_qkv_lora_layer_if_present: assert all(keys_present) or not any(keys_present)

This issue was fixed a couple months ago in #7313

ValueError: Unsupported Linear LoRA layer type: <class 'invokeai.backend.lora.layers.lokr_layer.LoKRLayer'>

As of v5.6.0rc4, LoKR layers work on most base FLUX models. After #7577 is merged and deployed, LoKR layers should work on all supported FLUX models.
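The `AssertionError` above comes from an all-or-nothing invariant: a fused QKV LoRA layer can only be built when the Q, K, and V sub-keys are either all present in the LoRA's state dict or all absent. The sketch below is a hypothetical illustration of that check, not InvokeAI's actual implementation; the key names (`to_q`, `to_k`, `to_v`) and the function body are assumptions for demonstration only.

```python
# Hypothetical sketch of the all-or-nothing QKV key check (not InvokeAI's
# actual code). A LoRA that supplies only some of the Q/K/V weights cannot
# be fused into a single QKV layer, which trips the assertion in the
# traceback above.
def add_qkv_lora_layer_if_present(state_dict: dict, prefix: str):
    qkv_keys = [f"{prefix}.{name}" for name in ("to_q", "to_k", "to_v")]
    keys_present = [key in state_dict for key in qkv_keys]
    # All-or-nothing invariant: either every Q/K/V key exists, or none do.
    assert all(keys_present) or not any(keys_present)
    if all(keys_present):
        # All three weights found: return them for fusion into one layer.
        return [state_dict[key] for key in qkv_keys]
    # None found: nothing to fuse for this prefix.
    return None
```

A LoRA file with a partial key set (for example only `to_q`) would fail the assertion, which matches the failure mode reported in this issue.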
