David Bertoin
d5ffd35d70
Update docs/source/en/api/pipelines/photon.md
...
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
adeb45e0b3
make fix-copies
2025-10-21 07:29:30 +00:00
DavidBert
7d12474c24
naming changes
2025-10-21 07:29:30 +00:00
DavidBert
0ef0dc6837
use dispatch_attention_fn for multiple attention backend support
2025-10-21 07:29:30 +00:00
David Bertoin
836cd12a18
Update docs/source/en/api/pipelines/photon.md
...
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
caf64407cb
Update docs/source/en/api/pipelines/photon.md
...
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
David Bertoin
c469a7a916
Update docs/source/en/api/pipelines/photon.md
...
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:29:30 +00:00
DavidBert
9aa47ce6c3
added doc to toctree
2025-10-21 07:29:30 +00:00
DavidBert
a8fa52ba2a
quantization example
2025-10-21 07:27:15 +00:00
David Bertoin
34a74928ac
Update docs/source/en/api/pipelines/photon.md
...
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-10-21 07:27:15 +00:00
DavidBert
0fdfd27ecb
renaming and remove unnecessary attribute setting
2025-10-21 07:27:15 +00:00
DavidBert
574f8fd10a
parameter names match the standard diffusers conventions
2025-10-21 07:27:15 +00:00
DavidBert
6e05172682
Revert accidental .gitignore change
2025-10-21 07:27:15 +00:00
DavidBert
d0c029f15d
built-in RMSNorm
2025-10-21 07:27:15 +00:00
DavidBert
015774399e
Refactor PhotonAttention to match Flux pattern
2025-10-21 07:27:14 +00:00
DavidBert
5f99168def
utility function that determines the default resolution given the VAE
2025-10-21 07:27:14 +00:00
DavidBert
c329c8f667
add pipeline test + corresponding fixes
2025-10-21 07:27:14 +00:00
David Bertoin
bb36735379
make quality + style
2025-10-21 07:27:14 +00:00
David Bertoin
8ee17d20b3
Use _import_structure for lazy loading
2025-10-21 07:27:14 +00:00
David Bertoin
be1d14658e
add negative prompts
2025-10-21 07:27:14 +00:00
David Bertoin
c951adef45
move cross-attention conditioning computation out of the denoising loop
2025-10-21 07:27:14 +00:00
David Bertoin
a74e0b726a
support prompt_embeds in call
2025-10-21 07:27:14 +00:00
David Bertoin
ffe3501c1c
rename vae_spatial_compression_ratio to vae_scale_factor
2025-10-21 07:27:14 +00:00
David Bertoin
2077252947
remove lora related code
2025-10-21 07:27:14 +00:00
David Bertoin
de1ceaf07a
rename LastLayer to FinalLayer
2025-10-21 07:27:14 +00:00
David Bertoin
3c60c9230e
put _attn_forward and _ffn_forward logic in PhotonBlock's forward
2025-10-21 07:27:14 +00:00
David Bertoin
af8882d7e6
remove modulation dataclass
2025-10-21 07:27:14 +00:00
David Bertoin
5f0bf0181f
Rename EmbedND to PhotonEmbedND
2025-10-21 07:27:14 +00:00
davidb
ae44d845b6
remove lora support from doc
2025-10-21 07:27:14 +00:00
davidb
12dbabe607
fix timestep shift
2025-10-21 07:27:14 +00:00
davidb
ec70e3fdc0
fix T5Gemma loading from hub
2025-10-21 07:27:14 +00:00
davidb
6634113ef6
update doc
2025-10-21 07:27:14 +00:00
davidb
b07d1c8799
update doc
2025-10-21 07:27:14 +00:00
davidb
a9e301366a
unify the structure of the forward block
2025-10-21 07:27:14 +00:00
davidb
5886925346
remove einops dependency; now inherits from AttentionMixin
2025-10-21 07:27:14 +00:00
davidb
25a0061d65
move PhotonAttnProcessor2_0 into transformer_photon
2025-10-21 07:27:14 +00:00
davidb
6284b9d062
remove enhance_vae and use vae.config directly when possible
2025-10-21 07:27:14 +00:00
davidb
60d918d79b
conditioned CFG
2025-10-21 07:27:14 +00:00
David Briand
b327b36ad9
BF16 example
2025-10-21 07:27:14 +00:00
davidb
14903ee599
remove autocast for text encoder forward
2025-10-21 07:27:14 +00:00
davidb
d71ddd0079
call enhance_vae_properties only if vae is provided
2025-10-21 07:27:14 +00:00
davidb
6a66fbd2c4
just store the T5Gemma encoder
2025-10-21 07:27:13 +00:00
davidb
e487660e05
Add Photon model and pipeline support
...
This commit adds support for the Photon image generation model:
- PhotonTransformer2DModel: Core transformer architecture
- PhotonPipeline: Text-to-image generation pipeline
- Attention processor updates for Photon-specific attention mechanism
- Conversion script for loading Photon checkpoints
- Documentation and tests
2025-10-21 07:27:13 +00:00
Steven Liu
5b5fa49a89
[docs] Organize toctree by modality (#12514)
...
* reorganize
* fix
---------
Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
2025-10-21 10:18:54 +05:30
Fei Xie
decfa3c9e1
Fix: incorrect temporary variable key when replacing adapter name… (#12502)
...
Fix: incorrect temporary variable key when replacing adapter name in state dict within the load_lora_adapter function
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
2025-10-20 15:45:37 -10:00
Dhruv Nair
48305755bf
Raise warning instead of error when imports are missing for custom code (#12513)
...
update
2025-10-20 07:02:23 -10:00
dg845
7853bfbed7
Remove Qwen Image Redundant RoPE Cache (#12452)
...
Refactor QwenEmbedRope to only use the LRU cache for RoPE caching
2025-10-19 18:41:58 -07:00
Lev Novitskiy
23ebbb4bc8
Kandinsky 5 is finally in Diffusers! (#12478)
...
* add kandinsky5 transformer pipeline first version
---------
Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
Co-authored-by: YiYi Xu <yixu310@gmail.com>
Co-authored-by: Charles <charles@huggingface.co>
2025-10-17 18:34:30 -10:00
Ali Imran
1b456bd5d5
docs: cleanup of runway model (#12503)
...
* cleanup of runway model
* quality fixes
2025-10-17 14:10:50 -07:00
Sayak Paul
af769881d3
[tests] introduce VAETesterMixin to consolidate tests for slicing and tiling (#12374)
...
* up
2025-10-17 12:02:29 +05:30