Yondon Fu
8842bcadb9
[SVD] Return np.ndarray when output_type="np" ( #6507 )
[SVD] Fix output_type="np"
2024-01-16 14:51:36 +05:30
YiYi Xu
fefed44543
update slow test for SDXL k-diffusion pipeline ( #6588 )
update expected slice
2024-01-15 18:54:33 -10:00
jquintanilla4
da843b3d53
.load_ip_adapter in StableDiffusionXLAdapterPipeline ( #6246 )
* Added testing notebook and .load_ip_adapter to XLAdapterPipeline
* Added annotations
* deleted testing notebook
* Update src/diffusers/pipelines/t2i_adapter/pipeline_stable_diffusion_xl_adapter.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* code clean up
* Add feature_extractor and image_encoder to components
---------
Co-authored-by: YiYi Xu <yixu310@gmail.com >
2024-01-11 13:04:08 +05:30
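With #6246 merged, the T2I-Adapter XL pipeline gains the same `load_ip_adapter` entry point other pipelines expose. A minimal sketch of wiring it up; the repo id, subfolder, weight file, and helper name below are illustrative assumptions, not values from the commit:

```python
def attach_ip_adapter(pipe,
                      repo_id="h94/IP-Adapter",
                      subfolder="sdxl_models",
                      weight_name="ip-adapter_sdxl.bin"):
    """Load IP-Adapter weights into any pipeline exposing `load_ip_adapter`.

    The default arguments are assumptions for this sketch.
    """
    pipe.load_ip_adapter(repo_id, subfolder=subfolder, weight_name=weight_name)
    return pipe
```

In practice `pipe` would be a `StableDiffusionXLAdapterPipeline`; the helper only forwards to the mixin method the PR enables.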
antoine-scenario
3e8b63216e
Add IP-Adapter to StableDiffusionXLControlNetImg2ImgPipeline ( #6293 )
* add IP-Adapter to StableDiffusionXLControlNetImg2ImgPipeline
Update src/diffusers/pipelines/controlnet/pipeline_controlnet_sd_xl_img2img.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
fix tests
* fix failing test
---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-01-09 22:02:11 -10:00
YiYi Xu
6313645b6b
add StableDiffusionXLKDiffusionPipeline ( #6447 )
---------
Co-authored-by: yiyixuxu <yixu310@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
2024-01-09 16:29:01 -10:00
Patrick von Platen
5bacc2f5af
[SAG] Support more schedulers, add better error message and make tests faster ( #6465 )
* finish
* finish
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-01-09 09:24:38 +05:30
Sayak Paul
ae060fc4f1
[feat] introduce unload_lora(). ( #6451 )
* introduce unload_lora.
* fix-copies
2024-01-05 16:22:11 +05:30
Sayak Paul
0a0bb526aa
[LoRA deprecation] LoRA deprecation trilogy ( #6450 )
* edebug
* debug
* more debug
* more more debug
* remove tests for LoRAAttnProcessors.
* rename
2024-01-05 15:48:20 +05:30
Sayak Paul
107e02160a
[LoRA tests] fix stuff related to assertions arising from the recent changes. ( #6448 )
* debug
* debug test_with_different_scales_fusion_equivalence
* use the right method.
* place it right.
* let's see.
* let's see again
* alright then.
* add a comment.
2024-01-04 12:55:15 +05:30
sayakpaul
6dbef45e6e
Revert "debug"
This reverts commit 7715e6c31c.
2024-01-04 10:39:38 +05:30
sayakpaul
7715e6c31c
debug
2024-01-04 10:39:00 +05:30
sayakpaul
05b3d36a25
Revert "debug"
This reverts commit fb4aec0ce3.
2024-01-04 10:38:04 +05:30
sayakpaul
fb4aec0ce3
debug
2024-01-04 10:37:28 +05:30
Sayak Paul
d700140076
[LoRA deprecation] handle rest of the stuff related to deprecated lora stuff. ( #6426 )
* handle rest of the stuff related to deprecated lora stuff.
* fix: copies
* don't modify the UNet in-place.
* fix: temporal autoencoder.
* manually remove lora layers.
* don't copy unet.
* alright
* remove lora attn processors from unet3d
* fix: unet3d.
* style
* Empty-Commit
2024-01-03 20:54:09 +05:30
Sayak Paul
2e4dc3e25d
[LoRA] add: test to check if peft loras are loadable in non-peft envs. ( #6400 )
* add: test to check if peft loras are loadable in non-peft envs.
* add torch_device appropriately.
* fix: get_dummy_inputs().
* test logits.
* rename
* debug
* debug
* fix: generator
* new assertion values after fixing the seed.
* shape
* remove print statements and settle this.
* to update values.
* change values when lora config is initialized under a fixed seed.
* update colab link
* update notebook link
* sanity restored by getting the exact same values without peft.
2024-01-03 09:57:49 +05:30
Fabio Rigano
86714b72d0
Add unload_ip_adapter method ( #6192 )
* Add unload_ip_adapter method
* Update attn_processors with original layers
* Add test
* Use set_default_attn_processor
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-01-02 14:40:46 +01:00
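The counterpart to loading: after #6192 a pipeline can shed its IP-Adapter again and fall back to its default attention processors. A hedged sketch; the helper name is ours, while `unload_ip_adapter` is the method the commit adds:

```python
def reset_ip_adapter(pipe):
    """Drop IP-Adapter layers and restore the default attention processors."""
    pipe.unload_ip_adapter()  # method introduced by this commit
    return pipe
```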
Sayak Paul
61f6c5472a
[LoRA] Remove the use of deprecated LoRA functionalities such as LoRAAttnProcessor ( #6369 )
* start deprecating loraattn.
* fix
* wrap into unet_lora_state_dict
* utilize text_encoder_lora_params
* utilize text_encoder_attn_modules
* debug
* debug
* remove print
* don't use text encoder for test_stable_diffusion_lora
* load the procs.
* set_default_attn_processor
* fix: set_default_attn_processor call.
* fix: lora_components[unet_lora_params]
* checking for 3d.
* 3d.
* more fixes.
* debug
* debug
* debug
* debug
* more debug
* more debug
* more debug
* more debug
* more debug
* more debug
* hack.
* remove comments and prep for a PR.
* appropriate set_lora_weights()
* fix
* fix: test_unload_lora_sd
* fix: test_unload_lora_sd
* use default attention processors.
* debug
* debug nan
* debug nan
* debug nan
* use NaN instead of inf
* remove comments.
* fix: test_text_encoder_lora_state_dict_unchanged
* attention processor default
* default attention processors.
* default
* style
2024-01-02 18:14:04 +05:30
Sayak Paul
6a376ceea2
[LoRA] remove unnecessary components from lora peft test suite ( #6401 )
remove unnecessary components from lora peft suite.
2023-12-30 18:25:40 +05:30
YiYi Xu
4c483deb90
[refactor embeddings] gligen + ip-adapter ( #6244 )
* refactor ip-adapter-imageproj, gligen
---------
Co-authored-by: yiyixuxu <yixu310@gmail.com>
2023-12-27 18:48:42 -10:00
Dhruv Nair
c1e8bdf1d4
Move ControlNetXS into Community Folder ( #6316 )
* update
* update
* update
* update
* update
* make style
* remove docs
* update
* move to research folder.
* fix-copies
* remove _toctree entry.
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-27 08:15:23 +05:30
Will Berman
0af12f1f8a
amused update links to new repo ( #6344 )
* amused update links to new repo
* lint
2023-12-26 22:46:28 +01:00
Justin Ruan
6e123688dc
Remove unused parameters and fixed FutureWarning ( #6317 )
* Remove unused parameters and fixed `FutureWarning`
* Fixed wrong config instance
* update unittest for `DDIMInverseScheduler`
2023-12-26 22:09:10 +01:00
Dhruv Nair
2026ec0a02
Interruptible Pipelines ( #5867 )
* add interruptable pipelines
* add tests
* updatemsmq
* add interrupt property
* make fix copies
* Revert "make fix copies"
This reverts commit 914b35332b.
* add docs
* add tutorial
* Update docs/source/en/tutorials/interrupting_diffusion_process.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update docs/source/en/tutorials/interrupting_diffusion_process.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* update
* fix quality issues
* fix
* update
---------
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-26 22:39:26 +05:30
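The interrupt property added in #5867 is driven from a step-end callback: the callback sets a flag on the pipeline, and the denoising loop is expected to check it and return early. A self-contained sketch of such a callback; the `stop_after` factory is illustrative, not code from the PR:

```python
def stop_after(step_limit):
    """Return a callback_on_step_end that interrupts once `step_limit` is reached."""
    def callback(pipe, step, timestep, callback_kwargs):
        if step >= step_limit:
            pipe._interrupt = True  # polled by the pipeline's denoising loop
        return callback_kwargs
    return callback
```

In use, the callback would be passed as `callback_on_step_end=stop_after(10)` when calling a pipeline that supports interruption.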
dg845
3706aa3305
Add rescale_betas_zero_snr Argument to DDPMScheduler ( #6305 )
* Add rescale_betas_zero_snr argument to DDPMScheduler.
* Propagate rescale_betas_zero_snr changes to DDPMParallelScheduler.
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-26 17:54:30 +01:00
Younes Belkada
3aba99af8f
[Peft / Lora] Add adapter_names in fuse_lora ( #5823 )
...
* add adapter_name in fuse
* add test
* up
* fix CI
* adapt from suggestion
* Update src/diffusers/utils/testing_utils.py
Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com >
* change to `require_peft_version_greater`
* change variable names in test
* Update src/diffusers/loaders/lora.py
Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com >
* break into 2 lines
* final comments
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com >
2023-12-26 16:54:47 +01:00
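With #5823, `fuse_lora` accepts an `adapter_names` argument so only a subset of the loaded adapters is fused into the base weights. A sketch of the call pattern; the helper and the adapter names are illustrative assumptions:

```python
def fuse_only(pipe, names):
    """Fuse just the named LoRA adapters; any others stay unfused."""
    pipe.fuse_lora(adapter_names=list(names))
    return pipe
```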
Sayak Paul
89459a5d56
fix: lora peft dummy components ( #6308 )
* fix: lora peft dummy components
* fix: dummy components
2023-12-25 11:26:45 +05:30
Dhruv Nair
fe574c8b29
LoRA Unfusion test fix ( #6291 )
update
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-24 14:31:48 +05:30
Sayak Paul
90b9479903
[LoRA PEFT] fix LoRA loading so that correct alphas are parsed ( #6225 )
* initialize alpha too.
* add: test
* remove config parsing
* store rank
* debug
* remove faulty test
2023-12-24 09:59:41 +05:30
Dhruv Nair
59d1caa238
Remove peft tests from old lora backend tests ( #6273 )
update
2023-12-22 13:35:52 +05:30
Dhruv Nair
c022e52923
Remove ONNX inpaint legacy ( #6269 )
update
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-22 13:35:21 +05:30
Will Berman
4039815276
open muse ( #5437 )
amused
rename
Update docs/source/en/api/pipelines/amused.md
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
AdaLayerNormContinuous default values
custom micro conditioning
micro conditioning docs
put lookup from codebook in constructor
fix conversion script
remove manual fused flash attn kernel
add training script
temp remove training script
add dummy gradient checkpointing func
clarify temperatures is an instance variable by setting it
remove additional SkipFF block args
hardcode norm args
rename tests folder
fix paths and samples
fix tests
add training script
training readme
lora saving and loading
non-lora saving/loading
some readme fixes
guards
Update docs/source/en/api/pipelines/amused.md
Co-authored-by: Suraj Patil <surajp815@gmail.com >
Update examples/amused/README.md
Co-authored-by: Suraj Patil <surajp815@gmail.com >
Update examples/amused/train_amused.py
Co-authored-by: Suraj Patil <surajp815@gmail.com >
vae upcasting
add fp16 integration tests
use tuple for micro cond
copyrights
remove casts
delegate to torch.nn.LayerNorm
move temperature to pipeline call
upsampling/downsampling changes
2023-12-21 11:40:55 -08:00
Benjamin Bossan
43979c2890
TST Fix LoRA test that fails with PEFT >= 0.7.0 ( #6216 )
See #6185 for context.
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-21 11:50:05 +01:00
Beinsezii
457abdf2cf
EulerAncestral add rescale_betas_zero_snr ( #6187 )
* EulerAncestral add `rescale_betas_zero_snr`
Uses same infinite sigma fix from EulerDiscrete. Interestingly the
ancestral version had the opposite problem: too much contrast instead of
too little.
* UT for EulerAncestral `rescale_betas_zero_snr`
* EulerAncestral upcast samples during step()
It helps this scheduler too, particularly when the model is using bf16.
While the noise dtype is still the model's, it is automatically upcast
for the add, so all it affects is determinism.
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-20 13:09:25 +05:30
Dhruv Nair
32ff4773d4
ControlNetXS fixes. ( #6228 )
update
2023-12-19 11:58:34 +05:30
Sayak Paul
9221da4063
fix: init for vae during pixart tests ( #6215 )
* fix: init for vae during pixart tests
* print the values
* add flatten
* correct assertion value for test_inference
* correct assertion values for test_inference_non_square_images
* run styling
* debug test_inference_with_multiple_images_per_prompt
* fix assertion values for test_inference_with_multiple_images_per_prompt
2023-12-18 18:16:57 -10:00
YiYi Xu
57fde871e1
offload the optional module image_encoder ( #6151 )
* offload image_encoder
* add test
---------
Co-authored-by: yiyixuxu <yixu310@gmail.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-18 15:10:01 -10:00
Dhruv Nair
781775ea56
Slow Test for Pipelines minor fixes ( #6221 )
update
2023-12-19 00:45:51 +05:30
Dhruv Nair
a0c54828a1
Deprecate Pipelines ( #6169 )
* deprecate pipe
* make style
* update
* add deprecation message
* format
* remove tests for deprecated pipelines
* remove deprecation message
* make style
* fix copies
* clean up
* clean
* clean
* clean
* clean up
* clean up
* clean up toctree
* clean up
---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
2023-12-18 23:08:29 +05:30
Patrick von Platen
cce1fe2d41
[Text-to-Video] Clean up pipeline ( #6213 )
* make style
* make style
* make style
* make style
2023-12-18 18:21:09 +01:00
Sayak Paul
2d94c7838e
[Core] feat: enable fused attention projections for other SD and SDXL pipelines ( #6179 )
* feat: enable fused attention projections for other SD and SDXL pipelines
* add: test for SD fused projections.
2023-12-16 08:45:54 +05:30
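A hedged pattern for the fused projections enabled here: fuse before inference and unfuse afterwards, so subsequent LoRA loading or fine-tuning sees the original layer layout. The wrapper is illustrative; `fuse_qkv_projections`/`unfuse_qkv_projections` are the methods the feature exposes:

```python
def with_fused_projections(pipe, fn):
    """Run `fn(pipe)` with q/k/v projections fused into a single matmul."""
    pipe.fuse_qkv_projections()
    try:
        return fn(pipe)
    finally:
        # Always restore the unfused layout, even if `fn` raises.
        pipe.unfuse_qkv_projections()
```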
Dhruv Nair
f5dfe2a8b0
LoRA test fixes ( #6163 )
* update
* update
* update
* update
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-15 08:39:41 +05:30
Aryan V S
88bdd97ccd
IP adapter support for most pipelines ( #5900 )
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py
* update tests
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py
* support ip-adapter in src/diffusers/pipelines/latent_consistency_models/pipeline_latent_consistency_text2img.py
* support ip-adapter in src/diffusers/pipelines/latent_consistency_models/pipeline_latent_consistency_img2img.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py
* revert changes to sd_attend_and_excite and sd_upscale
* make style
* fix broken tests
* update ip-adapter implementation to latest
* apply suggestions from review
---------
Co-authored-by: YiYi Xu <yixu310@gmail.com >
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-10 21:19:14 +05:30
Charchit Sharma
08b453e382
IP-Adapter for StableDiffusionControlNetImg2ImgPipeline ( #5901 )
* adapter for StableDiffusionControlNetImg2ImgPipeline
* fix-copies
* fix-copies
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2023-12-09 11:02:55 +05:30
Fabio Rigano
b65928b556
Add support for IPAdapterFull ( #5911 )
* Add support for IPAdapterFull
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
---------
Co-authored-by: YiYi Xu <yixu310@gmail.com >
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
2023-12-07 06:40:39 -10:00
Beinsezii
6bf1ca2c79
EulerDiscreteScheduler add rescale_betas_zero_snr ( #6024 )
* EulerDiscreteScheduler add `rescale_betas_zero_snr`
2023-12-06 21:51:04 -10:00
Dhruv Nair
79a7ab92d1
Fix clearing backend cache from device agnostic testing ( #6075 )
update
2023-12-07 11:18:31 +05:30
UmerHA
e192ae08d3
Add ControlNet-XS support ( #5827 )
* Check in 23-10-05
* check-in 23-10-06
* check-in 23-10-07 2pm
* check-in 23-10-08
* check-in 231009T1200
* check-in 230109
* checkin 231010
* init + forward run
* checkin
* checkin
* ControlNetXSModel is now saveable+loadable
* Forward works
* checkin
* Pipeline works with `no_control=True`
* checkin
* debug: save intermediate outputs of resnet
* checkin
* Understood time error + fixed connection error
* checkin
* checkin 231106T1600
* turned off detailed debug prints
* time debug logs
* small fix
* Separated control_scale for connections/time
* simplified debug logging
* Full denoising works with control scale = 0
* aligned logs
* Added control_attention_head_dim param
* Passing n_heads instead of dim_head into ctrl unet
* Fixed ctrl midblock bug
* Cleanup
* Fixed time dtype bug
* checkin
* 1. from_unet, 2. base passed, 3. all unet params
* checkin
* Finished docstrings
* cleanup
* make style
* checkin
* more tests pass
* Fixed tests
* removed debug logs
* make style + quality
* make fix-copies
* fixed documentation
* added cnxs to doc toc
* added control start/end param
* Update controlnetxs_sdxl.md
* tried to fix copies..
* Fixed norm_num_groups in from_unet
* added sdxl-depth test
* created SD2.1 controlnet-xs pipeline
* re-added debug logs
* Adjusting group norm ; readded logs
* Added debug log statements
* removed debug logs ; started tests for sd2.1
* updated sd21 tests
* fixed tests
* fixed tests
* slightly increased error tolerance for 1 test
* make style & quality
* Added docs for CNXS-SD
* make fix-copies
* Fixed sd compile test ; fixed gradient ckpointing
* vae downs = cnxs conditioning downs; removed guess
* make style & quality
* Fixed tests
* fixed test
* Incorporated review feedback
* simplified control model surgery
* fixed tests & make style / quality
* Updated docs; deleted pip & cursor files
* Rolled back minimal change to resnet
* Update resnet.py
* Update resnet.py
* Update src/diffusers/models/controlnetxs.py
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
* Update src/diffusers/models/controlnetxs.py
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
* Incorporated review feedback
* Update docs/source/en/api/pipelines/controlnetxs_sdxl.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update docs/source/en/api/pipelines/controlnetxs.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update docs/source/en/api/pipelines/controlnetxs.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update docs/source/en/api/pipelines/controlnetxs.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update src/diffusers/models/controlnetxs.py
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update src/diffusers/models/controlnetxs.py
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update src/diffusers/pipelines/controlnet_xs/pipeline_controlnet_xs.py
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update docs/source/en/api/pipelines/controlnetxs.md
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Update src/diffusers/pipelines/controlnet_xs/pipeline_controlnet_xs_sd_xl.py
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
* Incorporated doc feedback
---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com >
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com >
2023-12-06 23:33:47 +01:00
Sayak Paul
a2bc2e14b9
[feat] allow SDXL pipeline to run with fused QKV projections ( #6030 )
* debug
* from step
* print
* turn sigma a list
* make str
* init_noise_sigma
* comment
* remove prints
* feat: introduce fused projections
* change to a better name
* no grad
* device.
* device
* dtype
* okay
* print
* more print
* fix: unbind -> split
* fix: qkv >-> k
* enable disable
* apply attention processor within the method
* attn processors
* _enable_fused_qkv_projections
* remove print
* add fused projection to vae
* add todos.
* add: documentation and cleanups.
* add: test for qkv projection fusion.
* relax assertions.
* relax further
* fix: docs
* fix-copies
* correct error message.
* Empty-Commit
* better conditioning on disable_fused_qkv_projections
* check
* check processor
* bfloat16 computation.
* check latent dtype
* style
* remove copy temporarily
* cast latent to bfloat16
* fix: vae -> self.vae
* remove print.
* add _change_to_group_norm_32
* comment out stuff that didn't work
* Apply suggestions from code review
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
* reflect patrick's suggestions.
* fix imports
* fix: disable call.
* fix more
* fix device and dtype
* fix conditions.
* fix more
* Apply suggestions from code review
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com >
2023-12-06 07:33:26 +05:30
Arsalan
f427345ab1
Device agnostic testing ( #5612 )
* utils and test modifications to enable device agnostic testing
* device for manual seed in unet1d
* fix generator condition in vae test
* consistency changes to testing
* make style
* add device agnostic testing changes to source and one model test
* make dtype check fns private, log cuda fp16 case
* remove dtype checks from import utils, move to testing_utils
* adding tests for most model classes and one pipeline
* fix vae import
2023-12-05 19:04:13 +05:30
Dhruv Nair
f948778322
Move kandinsky convert script ( #6047 )
move kandinsky convert script
2023-12-05 15:12:37 +05:30