Operating system
Linux, Ubuntu 22.04
GPU vendor
Nvidia (CUDA)
Version number
v5.1.1
Browser
Chromium 127.0.6533.119
What happened
Successfully installed the LoRA with the model manager, then added the LoRA when generating an image with FLUX (quantized). The generation fails.
What you expected to happen
The image generates successfully.
Additional context
ValueError: Unsupported Linear LoRA layer type: <class 'invokeai.backend.lora.layers.lokr_layer.LoKRLayer'>
AssertionError in add_qkv_lora_layer_if_present: assert all(keys_present) or not any(keys_present)
Discord username
riku_block
This seems to be the same issue, caused by the quantized version. Have you tried the LoRA with a non-quantized checkpoint?
This issue was fixed a couple of months ago in #7313.
As of v5.6.0rc4, LoKR layers work on most base FLUX models. After #7577 is merged and deployed, LoKR layers should work on all supported FLUX models.
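For context on the AssertionError above: the check enforces an all-or-nothing rule, meaning either every QKV-related key for an attention layer is present in the LoRA's state dict, or none of them are. A LoKR file that matches only some of those keys trips the assertion. A rough sketch of that check (the key names below are illustrative, not InvokeAI's actual ones):

```python
# Sketch of the all-or-nothing QKV key check behind the assertion in
# add_qkv_lora_layer_if_present. Key names are hypothetical examples,
# not the actual keys used by InvokeAI.

def qkv_keys_consistent(state_dict: dict, qkv_keys: list[str]) -> bool:
    """Return True if the QKV keys are all present or all absent."""
    keys_present = [k in state_dict for k in qkv_keys]
    return all(keys_present) or not any(keys_present)

# Hypothetical key names for a fused-QKV attention layer:
keys = ["attn.to_q.lora_down", "attn.to_k.lora_down", "attn.to_v.lora_down"]

complete = {k: object() for k in keys}   # all three keys present -> OK
partial = {keys[0]: object()}            # only one key present -> assertion fires

print(qkv_keys_consistent(complete, keys))  # True
print(qkv_keys_consistent(partial, keys))   # False
```

In other words, a LoRA/LoKR file whose layer naming only partially matches the model's QKV projections produces exactly this kind of failure, which is consistent with it working after the fixes in the linked PRs.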