Comparing changes

base repository: huggingface/diffusers
base: main
head repository: sym-bot/diffusers
compare: main
  • 2 commits
  • 2 files changed
  • 2 contributors

Commits on Jan 31, 2026

  1. fix: graceful fallback for attention backend import failures

    Add try-except wrappers around all external attention backend imports
    (flash_attn, flash_attn_3, aiter, sageattention, flex_attention, torch_npu,
    torch_xla, xformers) to handle ABI mismatches and import failures gracefully.
    
    When an external attention backend fails to load (e.g., flash_attn compiled
    against a different PyTorch version), the code now logs a warning and falls
    back to native PyTorch SDPA instead of crashing.
    
    This fixes the "undefined symbol" errors that occur when flash_attn's compiled
    binaries are incompatible with the installed PyTorch version.
    
    See CHANGELOG.md for detailed documentation.
    
    Co-Authored-By: Claude Opus 4.5 <[email protected]>
    sym-bot and claude committed Jan 31, 2026
    Commit: 3d2d148
  2. Commit: c93a228