
[Fix]: Fall back to KleidiAI channelwise kernel when group size isn't suitable #1647

Open
wants to merge 1 commit into base: main
Conversation

ng-05 (Contributor) commented Jan 31, 2025

Description:

1. Some models can have odd weight shapes that cannot be used with blocked quantization. Fall back to channelwise quantization for those shapes (a minimal sketch of the check follows below).
2. Fix a formatting issue with the experimental tests.

Signed-off-by: Nikhil Gupta <[email protected]>
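To make the intent of item 1 concrete, here is a minimal sketch of the shape check that decides between blocked (groupwise) and channelwise quantization. The helper name and standalone structure are illustrative assumptions; the PR applies an equivalent check inline in the quantizer, and the exact KleidiAI alignment rules may differ.

```python
import torch


def should_fall_back_to_channelwise(weight: torch.Tensor, group_size: int) -> bool:
    """Illustrative helper: True when the reduction dim k cannot be split into
    `group_size` blocks that a groupwise KleidiAI kernel accepts."""
    k = weight.shape[-1]
    # Blocked quantization needs k to divide evenly into groups, and the
    # KleidiAI groupwise kernels expect group_size to be a multiple of 32.
    groupwise_ok = (k % group_size == 0) and (group_size % 32 == 0)
    return not groupwise_ok


# Example: an "odd" shape whose last dim is not a multiple of the group size.
w = torch.randn(64, 136)
print(should_fall_back_to_channelwise(w, group_size=32))                      # True -> channelwise
print(should_fall_back_to_channelwise(torch.randn(64, 128), group_size=32))   # False -> groupwise
```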

pytorch-bot bot commented Jan 31, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/1647

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Jan 31, 2025
ng-05 (Contributor, Author) commented Jan 31, 2025

@digantdesai @metascroy, can you please help with reviewing this?

if torch.backends.kleidiai.is_available():
    if isinstance(granularity, PerGroup):
        if weight.shape[-1] != group_size and group_size % 32 == 0:
Contributor

`weight.shape[-1] != group_size`: I don't understand this constraint.

IIRC the constraint is `k % group_size == 0 and group_size % 32 == 0`.
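A quick numeric comparison of the PR's condition and the reviewer's suggested constraint (values chosen purely for illustration) shows where they diverge:

```python
# PR condition (from the snippet above) vs. the reviewer's suggested constraint.
cases = [(256, 32), (136, 32), (64, 64)]
for k, group_size in cases:
    pr_cond = (k != group_size) and (group_size % 32 == 0)
    suggested = (k % group_size == 0) and (group_size % 32 == 0)
    print(f"k={k:3d} group_size={group_size:2d}  PR: {pr_cond}  suggested: {suggested}")

# k=136, group_size=32: the PR condition passes even though 136 is not a
# multiple of 32 -- the case the reviewer is questioning.
# k=64, group_size=64: the PR condition rejects (k == group_size), but blocking
# is valid here; with one group per row it is effectively channelwise anyway.
```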

@@ -628,15 +621,28 @@ def apply(weight, bias: Optional[torch.Tensor] = None):
    assert (
        TORCH_VERSION_AT_LEAST_2_6
    ), "aten target is requires torch version > 2.6.0"
    # Fallback to Channelwise scheme if group_size is too big
    if weight.shape[-1] < group_size:
Contributor

I'd recommend adding a top-level configuration option which clearly tells the user "group size can be changed for certain weight shapes" to handle this case, and throwing an exception if that config setting isn't on.
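One possible shape for such an option (purely illustrative; the config class, the field name `allow_group_size_fallback`, and the helper are assumptions rather than anything that exists in torchao):

```python
from dataclasses import dataclass


@dataclass
class IntxWeightQuantConfig:
    group_size: int = 32
    # When True, shapes whose last dim is not divisible by `group_size`
    # silently fall back to channelwise; when False, they raise instead.
    allow_group_size_fallback: bool = False


def resolve_group_size(k: int, cfg: IntxWeightQuantConfig) -> int:
    if k % cfg.group_size == 0:
        return cfg.group_size
    if cfg.allow_group_size_fallback:
        return k  # channelwise: one group spans the whole reduction dim
    raise ValueError(
        f"group_size={cfg.group_size} does not divide k={k}; "
        "set allow_group_size_fallback=True to fall back to channelwise"
    )
```

With the flag off, the user gets the explicit error the next comment asks for; with it on, the fallback this PR implements happens as an opt-in.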

Contributor

I don't understand why we are doing a fallback here. If `group_size` doesn't divide `weight.shape[-1]`, why not raise an exception and let the user explicitly move to channelwise?

Labels
CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.
5 participants