1
0
mirror of https://github.com/huggingface/diffusers.git synced 2026-01-27 17:22:53 +03:00
Files
diffusers/tests
CalamitousFelicitousness edf36f5128 Add ZImage LoRA support and integrate into ZImagePipeline (#12750)
* Add ZImage LoRA support and integrate into ZImagePipeline

* Add LoRA test for Z-Image

* Move the LoRA test

* Fix ZImage LoRA scale support and test configuration

* Add ZImage LoRA test overrides for architecture differences

- Override test_lora_fuse_nan to use ZImage's 'layers' attribute
  instead of 'transformer_blocks'
- Skip block-level LoRA scaling test (not supported in ZImage)
- Add required imports: numpy, torch_device, check_if_lora_correctly_set

* Add ZImageLoraLoaderMixin to LoRA documentation

* Use conditional import for peft.LoraConfig in ZImage tests

* Override test_correct_lora_configs_with_different_ranks for ZImage

ZImage uses 'attention.to_k' naming convention instead of 'attn.to_k',
so the base test's module name search loop never finds a match. This
override uses the correct naming pattern for ZImage architecture.

* Add is_flaky decorator to ZImage LoRA tests initialise padding tokens

* Skip ZImage LoRA test class entirely

Skip the entire ZImageLoRATests class due to non-deterministic behavior
from complex64 RoPE operations and torch.empty padding tokens.
LoRA functionality works correctly with real models.

Clean up removed:
- Individual @unittest.skip decorators
- @is_flaky decorator overrides for inherited methods
- Custom test method overrides
- Global torch deterministic settings
- Unused imports (numpy, is_flaky, check_if_lora_correctly_set)

---------

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
2025-12-02 02:16:30 -03:00
..
2025-11-30 20:27:59 -10:00
2025-11-30 20:27:59 -10:00