
File "/usr/local/python/lib/python3.8/site-packages/diffusers/models/attention_processor.py", line 3074, in __call__
[rank0]:     batch_size, key_tokens, _ = (
[rank0]: ValueError: too many values to unpack (expected 3) #11410


Open
laoniandisko opened this issue Apr 24, 2025 · 0 comments
Labels
bug Something isn't working


laoniandisko commented Apr 24, 2025

Describe the bug

This problem appeared after enabling xformers acceleration.

Reproduction

import torch


def __init__(self, pipeline):
    self.unet = pipeline.unet
    self.set_diffusers_xformers_flag(self.unet, True)


def set_diffusers_xformers_flag(self, model, valid):
    def fn_recursive_set_mem_eff(module: torch.nn.Module):
        if hasattr(module, "set_use_memory_efficient_attention_xformers"):
            module.set_use_memory_efficient_attention_xformers(valid)
            print("=" * 100)
            print(hasattr(module, "set_use_memory_efficient_attention_xformers"))

        for child in module.children():
            fn_recursive_set_mem_eff(child)

    fn_recursive_set_mem_eff(model)


def forward(self):
    self.attn1(
        mda_norm_hidden_states,
        encoder_hidden_states=encoder_hidden_states if self.only_cross_attention else None,
        attention_mask=attention_mask,
        **cross_attention_kwargs,
    )
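For context, the `ValueError` in the traceback comes from a three-way tuple unpack of a tensor shape inside the xformers attention processor, which expects a 3-D `(batch, tokens, channels)` input. A minimal sketch of the failure mechanism (plain tuples standing in for tensor shapes; not diffusers code):

```python
# A 3-D shape unpacks cleanly into (batch_size, key_tokens, _):
shape_3d = (2, 77, 768)
batch_size, key_tokens, _ = shape_3d

# A 4-D shape (e.g. heads not yet folded into the batch dimension)
# raises exactly the error from the traceback:
shape_4d = (2, 8, 77, 64)
try:
    batch_size, key_tokens, _ = shape_4d
except ValueError as err:
    print(err)  # too many values to unpack (expected 3)
```

So one thing worth checking is the shape of the tensor reaching `self.attn1` after xformers is enabled.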

Logs

PS: I am unable to share the rest of the code.
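For reference, the recursive traversal in the reproduction can be exercised in isolation. This is a dependency-free stand-in (the `Node`/`set_flag` names are hypothetical, not diffusers APIs) showing that the pattern does reach every child module:

```python
# Minimal stand-in for an nn.Module tree to illustrate the recursive
# flag-setting pattern used in set_diffusers_xformers_flag above.
class Node:
    def __init__(self, *children):
        self._children = list(children)
        self.flag = None

    def children(self):
        return self._children

    def set_flag(self, value):
        self.flag = value


def recursive_set(module, value):
    # Mirror the logic of fn_recursive_set_mem_eff: set on self, then recurse.
    if hasattr(module, "set_flag"):
        module.set_flag(value)
    for child in module.children():
        recursive_set(child, value)


leaf = Node()
root = Node(Node(leaf), Node())
recursive_set(root, True)
print(leaf.flag)  # True — every node in the tree was visited
```

Since the traversal itself is straightforward, the bug is more likely in what the flag switches on (the xformers attention processor) than in how it is set.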

System Info

  • 🤗 Diffusers version: 0.33.1
  • Platform: Linux-5.4.241-1-tlinux4-0017.7-x86_64-with-glibc2.2.5
  • Running on Google Colab?: No
  • Python version: 3.8.12
  • PyTorch version (GPU?): 2.4.0+cu118 (True)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Huggingface_hub version: 0.30.2
  • Transformers version: 4.46.3
  • Accelerate version: 1.0.1
  • PEFT version: 0.13.2
  • Bitsandbytes version: 0.42.0
  • Safetensors version: 0.4.3
  • xFormers version: 0.0.27.post2+cu118
  • Accelerator: NVIDIA H20, 97871 MiB

Who can help?

No response
