fix(diffusers): drop compel from requirements to unblock pip resolver (#9632)

compel 2.3.1 (latest, Nov 2025) declares transformers~=4.25 in its
metadata, i.e. >=4.25,<5.0. After transformers 5.0 (2026-01-26) and
huggingface-hub 1.0 (2025-10-27) shipped, the weekly DEPS_REFRESH
cache rotation in CI started seeing the new majors and pip's resolver
went into multi-hour backtracking storms walking every transformers
4.x candidate against every accelerate/hf-hub/tokenizers combination
to find a set compel would accept. The 2026-04-29 backend-build for
the diffusers backend (darwin-mps + l4t + cublas13-turboquant matrix
cells) hit the GitHub Actions 6h job timeout still inside pip
install — the build itself never started.

compel is the only hard upper bound on transformers in this stack
(diffusers, accelerate, peft, optimum-quanto are all flexible), and
upstream support for transformers 5 is still in flight: damian0815/
compel#129 ("Modernize Compel for Transformers 5") and #128 ("Bump
transformers version to >5.0") are both open as of today.

backend.py only constructs Compel() when COMPEL=1 is set in the env
(default off), so make compel a true optional extra:

  - Wrap the top-level `from compel import ...` in try/except
    ImportError, mirroring the existing sd_embed pattern.
  - Auto-disable COMPEL with a warning when the module isn't
    installed, instead of crashing on module load.
  - Drop compel from all eight requirements-*.txt variants so the
    resolver no longer has to satisfy its transformers cap.
  - Leave a TODO in backend.py and in each requirements file
    pointing at the upstream PR/issue, so the dependency can be
    reinstated once compel supports transformers >= 5.
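
The first two bullets amount to the standard optional-import pattern. A minimal self-contained sketch (using `some_missing_module` as a placeholder module name, since whether compel is installed varies by environment — backend.py imports compel itself here):

```python
import os
import sys

# Optional import: fall back to sentinels instead of crashing at module load.
# "some_missing_module" is a placeholder for compel in this sketch.
try:
    from some_missing_module import Compel, ReturnedEmbeddingsType
    COMPEL_AVAILABLE = True
except ImportError:
    Compel = None
    ReturnedEmbeddingsType = None
    COMPEL_AVAILABLE = False

# Gate the feature on both the env var (opt-in, default off) and actual
# availability; warn and auto-disable rather than crash.
COMPEL = os.environ.get("COMPEL", "0") == "1"
if COMPEL and not COMPEL_AVAILABLE:
    print("WARNING: COMPEL=1 but compel is not installed; disabling.",
          file=sys.stderr)
    COMPEL = False
```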

Users who rely on weighted-prompt embeddings can opt in with a
manual `pip install compel` alongside COMPEL=1; the warning emitted
on startup tells them how.

Assisted-by: Claude:claude-opus-4-7 [Bash Read Edit WebFetch]

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Commit 7325046650 (parent 8452068f43), authored by Ettore Di Giacinto,
committed by GitHub on 2026-05-01 14:45:14 +02:00.
9 changed files with 68 additions and 13 deletions

@@ -40,7 +40,19 @@ from diffusers import DiffusionPipeline, ControlNetModel
 from diffusers import FluxPipeline, FluxTransformer2DModel, AutoencoderKLWan
 from diffusers.pipelines.stable_diffusion import safety_checker
 from diffusers.utils import load_image, export_to_video
-from compel import Compel, ReturnedEmbeddingsType
+# TODO: re-enable compel as a hard dependency once it supports transformers >= 5.
+# Tracking upstream: https://github.com/damian0815/compel/pull/129
+# and https://github.com/damian0815/compel/issues/128
+# Until then compel pins transformers ~= 4.25, which forces the pip resolver into
+# multi-hour backtracking storms in CI when DEPS_REFRESH rotates the cache.
+# Keep the import optional and gate usage on the COMPEL env var (set COMPEL=1 to opt in).
+try:
+    from compel import Compel, ReturnedEmbeddingsType
+    COMPEL_AVAILABLE = True
+except ImportError:
+    Compel = None
+    ReturnedEmbeddingsType = None
+    COMPEL_AVAILABLE = False
 from optimum.quanto import freeze, qfloat8, quantize
 from transformers import T5EncoderModel
 from safetensors.torch import load_file
@@ -66,6 +78,9 @@ from diffusers import LTX2VideoTransformer3DModel, GGUFQuantizationConfig
 _ONE_DAY_IN_SECONDS = 60 * 60 * 24
 COMPEL = os.environ.get("COMPEL", "0") == "1"
+if COMPEL and not COMPEL_AVAILABLE:
+    print("WARNING: COMPEL is enabled but the compel module is not installed. Install it manually (`pip install compel`) or unset COMPEL. Falling back to standard prompt processing.", file=sys.stderr)
+    COMPEL = False
 SD_EMBED = os.environ.get("SD_EMBED", "0") == "1"
 # Warn if SD_EMBED is enabled but the module is not available
 if SD_EMBED and not SD_EMBED_AVAILABLE: