mirror of https://github.com/huggingface/diffusers.git synced 2026-01-27 17:22:53 +03:00
Commit Graph

5971 Commits

Author SHA1 Message Date
sayakpaul
53a2a7aff5 fix nits. 2025-10-21 04:31:48 -10:00
David Bertoin
8de7b9247a Update tests/pipelines/photon/test_pipeline_photon.py
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
5c54baacb7 Update tests/pipelines/photon/test_pipeline_photon.py
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
9e8279e1fe restrict the version of transformers
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
aed1f19396 Use Tuple instead of tuple
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
DavidBert
1354f450e6 make fix-copies 2025-10-21 07:29:30 +00:00
David Bertoin
fdc8e34533 Add PhotonTransformer2DModel to TYPE_CHECKING imports 2025-10-21 07:29:30 +00:00
David Bertoin
d5ffd35d70 Update docs/source/en/api/pipelines/photon.md
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
adeb45e0b3 make fix-copies 2025-10-21 07:29:30 +00:00
DavidBert
7d12474c24 naming changes 2025-10-21 07:29:30 +00:00
DavidBert
0ef0dc6837 use dispatch_attention_fn for multiple attention backend support 2025-10-21 07:29:30 +00:00
David Bertoin
836cd12a18 Update docs/source/en/api/pipelines/photon.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
caf64407cb Update docs/source/en/api/pipelines/photon.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
c469a7a916 Update docs/source/en/api/pipelines/photon.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
DavidBert
9aa47ce6c3 added doc to toctree 2025-10-21 07:29:30 +00:00
DavidBert
a8fa52ba2a quantization example 2025-10-21 07:27:15 +00:00
David Bertoin
34a74928ac Update docs/source/en/api/pipelines/photon.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:27:15 +00:00
DavidBert
0fdfd27ecb renaming and remove unnecessary attribute settings 2025-10-21 07:27:15 +00:00
DavidBert
574f8fd10a parameter names match the standard diffusers conventions 2025-10-21 07:27:15 +00:00
DavidBert
6e05172682 Revert accidental .gitignore change 2025-10-21 07:27:15 +00:00
DavidBert
d0c029f15d built-in RMSNorm 2025-10-21 07:27:15 +00:00
DavidBert
015774399e Refactor PhotonAttention to match Flux pattern 2025-10-21 07:27:14 +00:00
DavidBert
5f99168def utility function that determines the default resolution given the VAE 2025-10-21 07:27:14 +00:00
DavidBert
c329c8f667 add pipeline test + corresponding fixes 2025-10-21 07:27:14 +00:00
David Bertoin
bb36735379 make quality + style 2025-10-21 07:27:14 +00:00
David Bertoin
8ee17d20b3 Use _import_structure for lazy loading 2025-10-21 07:27:14 +00:00
David Bertoin
be1d14658e add negative prompts 2025-10-21 07:27:14 +00:00
David Bertoin
c951adef45 move cross-attention conditioning computation out of the denoising loop 2025-10-21 07:27:14 +00:00
David Bertoin
a74e0b726a support prompt_embeds in call 2025-10-21 07:27:14 +00:00
David Bertoin
ffe3501c1c rename vae_spatial_compression_ratio to vae_scale_factor 2025-10-21 07:27:14 +00:00
David Bertoin
2077252947 remove lora related code 2025-10-21 07:27:14 +00:00
David Bertoin
de1ceaf07a rename LastLayer to FinalLayer 2025-10-21 07:27:14 +00:00
David Bertoin
3c60c9230e put _attn_forward and _ffn_forward logic in PhotonBlock's forward 2025-10-21 07:27:14 +00:00
David Bertoin
af8882d7e6 remove modulation dataclass 2025-10-21 07:27:14 +00:00
David Bertoin
5f0bf0181f Rename EmbedND to PhotoEmbedND 2025-10-21 07:27:14 +00:00
davidb
ae44d845b6 remove lora support from doc 2025-10-21 07:27:14 +00:00
davidb
12dbabe607 fix timestep shift 2025-10-21 07:27:14 +00:00
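The "fix timestep shift" commit concerns the shift applied to flow-matching sigmas. A sketch of the standard transform used by diffusers' flow-match schedulers (assumption: Photon uses this same form; the specific bug fixed in the commit is not reproduced here):

```python
def shift_sigma(sigma: float, shift: float = 3.0) -> float:
    """Apply the flow-matching timestep shift.

    For shift > 1 this pushes sigmas toward 1.0, spending more of the
    sampling schedule at high noise levels; shift = 1 is the identity.
    """
    return shift * sigma / (1.0 + (shift - 1.0) * sigma)
```

Note the endpoints are fixed points (`shift_sigma(0.0) == 0.0`, `shift_sigma(1.0) == 1.0`), so only the interior of the schedule is reshaped.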
davidb
ec70e3fdc0 fix T5Gemma loading from hub 2025-10-21 07:27:14 +00:00
davidb
6634113ef6 update doc 2025-10-21 07:27:14 +00:00
davidb
b07d1c8799 update doc 2025-10-21 07:27:14 +00:00
davidb
a9e301366a unify the structure of the forward block 2025-10-21 07:27:14 +00:00
davidb
5886925346 remove einops dependency and now inherits from AttentionMixin 2025-10-21 07:27:14 +00:00
davidb
25a0061d65 move PhotonAttnProcessor2_0 in transformer_photon 2025-10-21 07:27:14 +00:00
davidb
6284b9d062 remove enhance vae and use vae.config directly when possible 2025-10-21 07:27:14 +00:00
davidb
60d918d79b conditioned CFG 2025-10-21 07:27:14 +00:00
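The "conditioned CFG" commit touches classifier-free guidance in the pipeline. For reference, a minimal sketch of the standard CFG combination step (assumption: this is the textbook formula; the commit's specific conditioning logic is not reproduced):

```python
def cfg_combine(noise_uncond, noise_cond, guidance_scale):
    """Classifier-free guidance: extrapolate from the unconditional
    prediction toward the conditional one by guidance_scale."""
    return [
        u + guidance_scale * (c - u)
        for u, c in zip(noise_uncond, noise_cond)
    ]
```

With `guidance_scale = 1.0` this reduces to the conditional prediction; larger scales amplify the prompt's influence at the cost of diversity.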
David Briand
b327b36ad9 BF16 example 2025-10-21 07:27:14 +00:00
davidb
14903ee599 remove autocast for text encoder forward 2025-10-21 07:27:14 +00:00
davidb
d71ddd0079 call enhance_vae_properties only if vae is provided 2025-10-21 07:27:14 +00:00
davidb
6a66fbd2c4 just store the T5Gemma encoder 2025-10-21 07:27:13 +00:00
davidb
e487660e05 Add Photon model and pipeline support
This commit adds support for the Photon image generation model:
- PhotonTransformer2DModel: Core transformer architecture
- PhotonPipeline: Text-to-image generation pipeline
- Attention processor updates for Photon-specific attention mechanism
- Conversion script for loading Photon checkpoints
- Documentation and tests
2025-10-21 07:27:13 +00:00