
Default to using edge_program state dict if quantization data not available#17452

Open
JakeStevens wants to merge 1 commit into pytorch:main from JakeStevens:export-D93253828

Conversation


@JakeStevens (Contributor) commented Feb 13, 2026

Summary:
PR #16901 added logic to replace FakeTensors with actual quantization data.

The docstring says:

```
        :param post_quantization_state_dict: State-dict of the model right after quantization. During partitioning, the
                                              `edge_program` only contains fake tensors without any data. In this case,
                                              this state dict is used instead (if provided). Notice: It may potentially
                                              contain outdated data,
```

with the key part being **if provided**. However, this is not the actual behavior: the implementation never falls back to the `edge_program` state dict when no quantization data is given.

Without the fallback, this introduces a regression for depthwise convolutions, which have a "no dynamic" constraint that is triggered by the missing tensor data.

This PR restores the original behavior if quantization data is not provided.
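The restored fallback can be sketched as follows. This is a minimal illustration of the logic described above; the function and parameter names are hypothetical and do not correspond to the actual ExecuTorch backend API.

```python
# Hypothetical sketch of the fallback behavior this PR restores.
# Names (get_tensor_data, post_quantization_state_dict) are illustrative,
# not the real ExecuTorch/NXP backend implementation.
def get_tensor_data(name, edge_program, post_quantization_state_dict=None):
    """Prefer the post-quantization state dict, but fall back to the
    edge_program's own state dict when no quantization data was provided."""
    if (post_quantization_state_dict is not None
            and name in post_quantization_state_dict):
        # Quantization data was provided: use it (it has real values,
        # while the edge_program may only hold FakeTensors here).
        return post_quantization_state_dict[name]
    # Fallback (the original behavior): read from the edge_program state dict.
    return edge_program.state_dict[name]
```

With this shape, passing `post_quantization_state_dict=None` simply reproduces the pre-#16901 behavior instead of failing to resolve the tensor data.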

Differential Revision: D93253828

cc @robert-kalmar @digantdesai


pytorch-bot bot commented Feb 13, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/17452

Note: Links to docs will display an error until the docs builds have been completed.

❌ 3 New Failures, 1 Cancelled Job

As of commit 34e7a93 with merge base c12dc35:

NEW FAILURES - The following jobs have failed:

CANCELLED JOB - The following job was cancelled. Please retry:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Feb 13, 2026

meta-codesync bot commented Feb 13, 2026

@JakeStevens has exported this pull request. If you are a Meta employee, you can view the originating Diff in D93253828.

@JakeStevens JakeStevens added the module: nxp Issues related to NXP Neutron NPU delegation and code under backends/nxp/ label Feb 13, 2026
@github-actions

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.


Labels

- CLA Signed — This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.
- fb-exported
- meta-exported
- module: nxp — Issues related to NXP Neutron NPU delegation and code under backends/nxp/
