Edna
1bd8fdfcb6
don't return length
2025-06-11 20:56:27 -06:00
Edna
406ab3b1e9
remove guidance from embeddings
2025-06-11 20:47:59 -06:00
Edna
e31c94866d
remove guidance embed (pipeline)
2025-06-11 20:46:59 -06:00
Edna
01bc0dcc56
remove guidance
2025-06-11 20:45:45 -06:00
Edna
e69d73099d
use DN6 embeddings
2025-06-11 20:05:28 -06:00
Edna
442f77a2d7
use chroma pipeline output
2025-06-11 19:59:43 -06:00
Edna
ab7942174a
use DN6 attn mask + fix true_cfg_scale
2025-06-11 19:57:31 -06:00
Edna
f6de1afc3f
update
2025-06-11 19:54:27 -06:00
Edna
f783f38883
ensure correct dtype for chroma embeddings
2025-06-11 19:52:43 -06:00
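The dtype-cast commit above reflects a common pattern: text-encoder outputs may arrive in a different dtype than the transformer weights expect. A minimal sketch of the cast, with illustrative shapes and a stand-in `target_dtype` (not the pipeline's actual variable names):

```python
import torch

# Text encoders often run in float32 while the transformer runs in a lower
# precision; casting the embeddings up front avoids silent upcasts or
# dtype-mismatch errors inside attention. `target_dtype` stands in for the
# transformer's dtype.
prompt_embeds = torch.randn(1, 77, 4096, dtype=torch.float32)
target_dtype = torch.bfloat16
prompt_embeds = prompt_embeds.to(dtype=target_dtype)
```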
Edna
a3b6697bc3
Merge branch 'main' into chroma
2025-06-11 19:48:02 -06:00
Edna
68f771bf43
take pooled projections out of transformer
2025-06-11 19:38:38 -06:00
Edna
df7fde7a6d
fix load
2025-06-11 19:36:34 -06:00
Edna
77b429eda4
change to my own unpooled embedder
2025-06-11 19:35:10 -06:00
Edna
3309ffef1c
remove pooled prompt embeds
2025-06-11 19:33:17 -06:00
Edna
146255aba1
no attn mask (can't get it to work)
2025-06-11 19:17:29 -06:00
Edna
c9b46af65f
wrap attn mask
2025-06-11 19:16:24 -06:00
Edna
7c75d8e98d
don't modify mask (for now)
2025-06-11 19:15:18 -06:00
Edna
38429ffcac
remove mask function
2025-06-11 19:11:47 -06:00
Edna
f190c02af7
work on swapping text encoders
2025-06-11 19:09:37 -06:00
Edna
6c0aed14db
remove prompt_2
2025-06-11 19:06:45 -06:00
Edna
0b027a2453
swap embedder location
2025-06-11 19:04:52 -06:00
Edna
2fcc75a6d8
take out variant from blocks
2025-06-11 18:55:56 -06:00
Edna
af918c89dd
change to chroma transformer
2025-06-11 18:55:03 -06:00
Edna
7445cf422a
add chroma to pipeline init
2025-06-11 18:53:06 -06:00
Edna
a6f231c7ce
add chroma to auto pipeline
2025-06-11 18:51:45 -06:00
Edna
6441e70def
update
2025-06-11 18:48:44 -06:00
Edna
f0c75b6b6f
update
2025-06-11 18:46:51 -06:00
Edna
5eb4b822ae
fix single file
2025-06-11 18:38:58 -06:00
Edna
4e698b1088
add chroma to init
2025-06-11 18:21:10 -06:00
Edna
c22930d7cc
add chroma to init
2025-06-11 18:18:56 -06:00
Edna
7400278857
add chroma transformer to dummy tp
2025-06-11 18:16:44 -06:00
Tolga Cangöz
47ef79464f
Apply Occam's Razor in position embedding calculation (#11562)
...
* fix: remove redundant indexing
* style
2025-06-11 13:47:37 -10:00
Joel Schlosser
b272807bc8
Avoid DtoH sync from access of nonzero() item in scheduler (#11696)
2025-06-11 12:03:40 -10:00
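The scheduler fix above targets a well-known PyTorch pitfall: calling `.item()` on the result of `nonzero()` copies the scalar back to the CPU, forcing a device-to-host sync that stalls the GPU pipeline. A minimal sketch of the pattern (run on CPU here for illustration):

```python
import torch

timesteps = torch.tensor([999, 800, 600, 400, 200])
t = 600

# Sync-inducing pattern: .item() transfers the scalar device-to-host.
idx_sync = (timesteps == t).nonzero().item()

# Sync-free alternative: keep the index as a tensor and index with it
# directly, so everything stays on the device.
idx = (timesteps == t).nonzero()[0]
selected = timesteps[idx]
```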
rasmi
447ccd0679
Set _torch_version to N/A if torch is disabled. (#11645)
2025-06-11 11:59:54 -10:00
Aryan
f3e09114f2
Improve Wan docstrings (#11689)
...
* improve docstrings for wan
* Apply suggestions from code review
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
* make style
---------
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2025-06-12 01:18:40 +05:30
Sayak Paul
91545666e0
[tests] model-level device_map clarifications (#11681)
...
* add clarity in documentation for device_map
* docs
* fix how compiler tester mixins are used.
* propagate
* more
* typo.
* fix tests
* fix order of decorators.
* clarify more.
* more test cases.
* fix doc
* fix device_map docstring in pipeline_utils.
* more examples
* more
* update
* remove code for stuff that is already supported.
* fix stuff.
2025-06-11 22:41:59 +05:30
Sayak Paul
b6f7933044
[tests] tests for compilation + quantization (bnb) (#11672)
...
* start adding compilation tests for quantization.
* fixes
* make common utility.
* modularize.
* add group offloading+compile
* xfail
* update
* Update tests/quantization/test_torch_compile_utils.py
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
* fixes
---------
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
2025-06-11 21:14:24 +05:30
Yao Matrix
33e636cea5
enable torchao test cases on XPU and switch to device-agnostic APIs for test cases (#11654)
...
* enable torchao cases on XPU
Signed-off-by: Matrix YAO <matrix.yao@intel.com>
* device agnostic APIs
Signed-off-by: YAO Matrix <matrix.yao@intel.com>
* more
Signed-off-by: YAO Matrix <matrix.yao@intel.com>
* fix style
Signed-off-by: YAO Matrix <matrix.yao@intel.com>
* enable test_torch_compile_recompilation_and_graph_break on XPU
Signed-off-by: YAO Matrix <matrix.yao@intel.com>
* resolve comments
Signed-off-by: YAO Matrix <matrix.yao@intel.com>
---------
Signed-off-by: Matrix YAO <matrix.yao@intel.com>
Signed-off-by: YAO Matrix <matrix.yao@intel.com>
2025-06-11 15:17:06 +05:30
Tolga Cangöz
e27142ac64
[Wan] Fix VAE sampling mode in WanVideoToVideoPipeline (#11639)
...
* fix: vae sampling mode
* fix a typo
2025-06-11 14:19:23 +05:30
Sayak Paul
8e88495da2
[LoRA] support Flux Control LoRA with bnb 8bit. (#11655)
...
support Flux Control LoRA with bnb 8bit.
2025-06-11 08:32:47 +05:30
Akash Haridas
b79803fe08
Allow remote code repo names to contain "." (#11652)
...
* allow loading from repo with dot in name
* put new arg at the end to avoid breaking compatibility
* add test for loading repo with dot in name
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
2025-06-10 13:38:54 -10:00
Meatfucker
b0f7036d9a
Update pipeline_flux_inpaint.py to fix padding_mask_crop returning only the inpainted area (#11658)
...
* Update pipeline_flux_inpaint.py to fix padding_mask_crop returning only the inpainted area and not the entire image.
* Apply style fixes
* Update src/diffusers/pipelines/flux/pipeline_flux_inpaint.py
2025-06-10 13:07:22 -04:00
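The inpaint fix above changes `padding_mask_crop` handling so the processed crop is pasted back into the full image rather than returned alone. A rough sketch of that paste-back step using PIL; the crop box and colors are made up for illustration:

```python
from PIL import Image

# `padding_mask_crop` makes the pipeline denoise only a crop around the mask.
# The corrected behavior re-inserts that processed crop into the original
# image so the caller gets the full-size result, not just the crop.
original = Image.new("RGB", (1024, 1024), "gray")
crop_box = (256, 256, 768, 768)                       # (left, upper, right, lower)
processed_crop = Image.new("RGB", (512, 512), "red")  # stands in for the inpainted region

result = original.copy()
result.paste(processed_crop, crop_box[:2])  # paste back at the crop's origin
```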
Edna
32659236b2
make chroma output class
2025-06-10 02:24:23 -06:00
Edna
c8cbb31614
add chroma init
2025-06-10 02:22:52 -06:00
Edna
b0df9691d2
get decently far in changing variant stuff
2025-06-10 02:09:52 -06:00
Edna
22ecd19f91
take out variant stuff
2025-06-09 21:32:52 -06:00
Edna
33ea0b65a4
add chroma to transformer init
2025-06-09 21:25:19 -06:00
Edna
bc36a0d883
add chroma to mappings
2025-06-09 21:15:19 -06:00
Edna
32e6a006cf
add chroma loader
2025-06-09 21:13:32 -06:00
Edna
15f2bd5c39
working state (embeddings)
2025-06-09 21:05:59 -06:00