Marc Sun
e4325606db
Fix loading sharded checkpoints when we have variants ( #9061 )
...
* Fix loading sharded checkpoint when we have variant
* add test
* remove print
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-08-06 13:38:44 -10:00
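For context, a minimal sketch of the loading path this fix covers, assuming a checkpoint whose `unet` subfolder ships sharded fp16 weights with a variant-suffixed index file (the repo id below is a placeholder):

```python
from diffusers import UNet2DConditionModel

# Placeholder repo: its "unet" subfolder is assumed to contain fp16 weights
# split across several shards plus a variant-suffixed safetensors index JSON.
unet = UNet2DConditionModel.from_pretrained(
    "your-org/sharded-model",  # hypothetical repo id
    subfolder="unet",
    variant="fp16",  # the variant the shard index must also be resolved for
)
```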
Vinh H. Pham
87e50a2f1d
[Tests] Improve transformers model test suite coverage - Hunyuan DiT ( #8916 )
...
* add hunyuan model test
* apply suggestions
* reduce dims further
* reduce dims further
* run make style
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-08-06 12:59:30 +05:30
Vinh H. Pham
e1d508ae92
[Tests] Improve transformers model test suite coverage - Latte ( #8919 )
...
* add LatteTransformer3DModel model test
* change patch_size to 1
* reduce req len
* reduce channel dims
* increase num_layers
* reduce dims further
* run make style
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
Co-authored-by: Aryan <aryan@huggingface.co >
2024-08-05 17:13:03 +05:30
Sayak Paul
0e460675e2
[Flux] allow tests to run ( #9050 )
...
* fix tests
* fix
* float64 skip
* remove sample_size.
* remove
* remove more
* default_sample_size.
* credit black forest for flux model.
* skip
* fix: tests
* remove OriginalModelMixin
* add transformer model test
* add: transformer model tests
2024-08-02 11:49:59 +05:30
YiYi Xu
95a7832879
fix load sharded checkpoint from a subfolder (local path) ( #8913 )
...
fix
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-08-01 20:15:42 +05:30
Yoach Lacombe
ea1b4ea7ca
Fix Stable Audio repository id ( #9016 )
...
Fix Stable Audio repo id
2024-07-30 23:17:44 +05:30
Yoach Lacombe
69e72b1dd1
Stable Audio integration ( #8716 )
...
* WIP modeling code and pipeline
* add custom attention processor + custom activation + add to init
* correct ProjectionModel forward
* add stable audio to __init__
* add autoencoder and update pipeline and modeling code
* add half Rope
* add partial rotary v2
* add temporary modifications to scheduler
* add EDM DPM Solver
* remove TODOs
* clean GLU
* remove att.group_norm to attn processor
* revert back src/diffusers/schedulers/scheduling_dpmsolver_multistep.py
* refactor GLU -> SwiGLU
* remove redundant args
* add channel multiples in autoencoder docstrings
* changes in docstrings and copyright headers
* clean pipeline
* further cleaning
* remove peft and lora and fromoriginalmodel
* Delete src/diffusers/pipelines/stable_audio/diffusers.code-workspace
* make style
* dummy models
* fix copied from
* add fast oobleck tests
* add brownian tree
* oobleck autoencoder slow tests
* remove TODO
* fast stable audio pipeline tests
* add slow tests
* make style
* add first version of docs
* wrap is_torchsde_available to the scheduler
* fix slow test
* test with input waveform
* add input waveform
* remove some todos
* create stableaudio gaussian projection + make style
* add pipeline to toctree
* fix copied from
* make quality
* refactor timestep_features->time_proj
* refactor joint_attention_kwargs->cross_attention_kwargs
* remove forward_chunk
* move StableAudioDitModel to transformers folder
* correct convert + remove partial rotary embed
* apply suggestions from yiyixuxu -> removing attn.kv_heads
* remove temb
* remove cross_attention_kwargs
* further removal of cross_attention_kwargs
* remove text encoder autocast to fp16
* continue removing autocast
* make style
* refactor how text and audio are embedded
* add paper
* update example code
* make style
* unify projection model forward + fix device placement
* make style
* remove fuse qkv
* apply suggestions from review
* Update src/diffusers/pipelines/stable_audio/pipeline_stable_audio.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* make style
* smaller models in fast tests
* pass sequential offloading fast tests
* add docs for vae and autoencoder
* make style and update example
* remove useless import
* add cosine scheduler
* dummy classes
* cosine scheduler docs
* better description of scheduler
---------
Co-authored-by: YiYi Xu <yixu310@gmail.com >
2024-07-30 15:29:06 +05:30
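A short usage sketch of the pipeline this commit introduces; the generation arguments mirror the Stable Audio docs but exact values should be treated as assumptions:

```python
import torch
import soundfile as sf
from diffusers import StableAudioPipeline

pipe = StableAudioPipeline.from_pretrained(
    "stabilityai/stable-audio-open-1.0", torch_dtype=torch.float16
).to("cuda")

# Generate roughly five seconds of audio from a text prompt.
audio = pipe(
    "A hammer hitting a wooden surface",
    negative_prompt="low quality",
    num_inference_steps=100,
    audio_end_in_s=5.0,
).audios[0]

sf.write("hammer.wav", audio.T.float().cpu().numpy(), pipe.vae.config.sampling_rate)
```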
Dhruv Nair
93983b6780
[CI] Skip flaky download tests in PR CI ( #8945 )
...
update
2024-07-24 09:25:06 +05:30
Vinh H. Pham
7a95f8d9d8
[Tests] Improve transformers model test suite coverage - Temporal Transformer ( #8932 )
...
* add test for temporal transformer
* remove unused variable
* fix code quality
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-07-23 15:36:30 +05:30
Sayak Paul
af400040f5
[Tests] proper skipping of request caching test ( #8908 )
...
proper skipping of request caching test
2024-07-22 12:52:57 -10:00
Sayak Paul
0f09b01ab3
[Core] fix: shard loading and saving when variant is provided. ( #8869 )
...
fix: shard loading and saving when variant is provided.
2024-07-17 08:26:28 +05:30
Sayak Paul
2261510bbc
[Core] Add AuraFlow ( #8796 )
...
* add lavender flow transformer
---------
Co-authored-by: YiYi Xu <yixu310@gmail.com >
2024-07-11 08:50:19 -10:00
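A hedged sketch of driving the AuraFlow transformer added here through its pipeline; the checkpoint id and guidance value are assumptions:

```python
import torch
from diffusers import AuraFlowPipeline

pipe = AuraFlowPipeline.from_pretrained(
    "fal/AuraFlow",  # assumed checkpoint id
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    "a red panda reading a book under a cherry tree",
    num_inference_steps=50,
    guidance_scale=3.5,  # assumed value
).images[0]
image.save("auraflow.png")
```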
Sayak Paul
a785992c1d
[Tests] fix more sharding tests ( #8797 )
...
* fix
* fix
* ugly
* okay
* fix more
* fix oops
2024-07-09 13:09:36 +05:30
Tolga Cangöz
57084dacc5
Remove unnecessary lines ( #8569 )
...
* Remove unused line
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-07-08 10:42:02 -10:00
YiYi Xu
9e9ed353a2
fix loading sharded checkpoints from subfolder ( #8798 )
...
* fix load sharded checkpoints from subfolder
* style
* os.path.join
* add a small test
---------
Co-authored-by: sayakpaul <spsayakpaul@gmail.com >
2024-07-06 11:32:04 -10:00
Sayak Paul
31adeb41cd
[Tests] fix sharding tests ( #8764 )
...
fix sharding tests
2024-07-04 08:50:59 +05:30
Mathis Koroglu
3e0d128da7
Motion Model / Adapter versatility ( #8301 )
...
* Motion Model / Adapter versatility
- allow to use a different number of layers per block
- allow to use a different number of transformer per layers per block
- allow a different number of motion attention head per block
- use dropout argument in get_down/up_block in 3d blocks
* Motion Model added arguments renamed & refactoring
* Add test for asymmetric UNetMotionModel
2024-06-27 11:11:29 +05:30
Dhruv Nair
effe4b9784
Update xformers SD3 test ( #8712 )
...
update
2024-06-26 10:24:27 -10:00
Dhruv Nair
0f0b531827
Add decorator for compile tests ( #8703 )
...
* update
* update
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-06-26 11:26:47 +05:30
Sayak Paul
4ad7a1f5fd
[Chore] create a utility for calculating the expected number of shards. ( #8692 )
...
create a utility for calculating the expected number of shards.
2024-06-25 17:05:39 +05:30
Tolga Cangöz
c375903db5
Errata - Fix typos & improve contributing page ( #8572 )
...
* Fix typos & improve contributing page
* `make style && make quality`
* fix typos
* Fix typo
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-06-24 14:13:03 +05:30
YiYi Xu
c71c19c5e6
a few fix for shard checkpoints ( #8656 )
...
fix
Co-authored-by: yiyixuxu <yixu310@gmail.com>
2024-06-21 12:50:58 +05:30
Marc Sun
96399c3ec6
Fix sharding when no device_map is passed ( #8531 )
...
* Fix sharding when no device_map is passed
* style
* add tests
* align
* add docstring
* format
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-06-18 05:47:23 -10:00
Dhruv Nair
04717fd861
Add Stable Diffusion 3 ( #8483 )
...
* up
* add sd3
* update
* update
* add tests
* fix copies
* fix docs
* update
* add dreambooth lora
* add LoRA
* update
* update
* update
* update
* import fix
* update
* Update src/diffusers/pipelines/stable_diffusion_3/pipeline_stable_diffusion_3.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* import fix 2
* update
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* update
* update
* update
* fix ckpt id
* fix more ids
* update
* missing doc
* Update src/diffusers/schedulers/scheduling_flow_match_euler_discrete.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/schedulers/scheduling_flow_match_euler_discrete.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update docs/source/en/api/pipelines/stable_diffusion/stable_diffusion_3.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* Update docs/source/en/api/pipelines/stable_diffusion/stable_diffusion_3.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* update
* fix
* update
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
* Update src/diffusers/models/autoencoders/autoencoder_kl.py
* note on gated access.
* requirements
* licensing
---------
Co-authored-by: sayakpaul <spsayakpaul@gmail.com >
Co-authored-by: YiYi Xu <yixu310@gmail.com >
2024-06-12 20:44:00 +01:00
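As the commit notes gated access, a minimal sketch of using the new pipeline after accepting the license and logging in (`huggingface-cli login`); sampler settings are illustrative:

```python
import torch
from diffusers import StableDiffusion3Pipeline

pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    "a photo of an astronaut riding a horse on the moon",
    num_inference_steps=28,
    guidance_scale=7.0,
).images[0]
image.save("sd3.png")
```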
Sayak Paul
7d887118b9
[Core] support saving and loading of sharded checkpoints ( #7830 )
...
* feat: support saving a model in sharded checkpoints.
* feat: make loading of sharded checkpoints work.
* add tests
* cleanse the loading logic a bit more.
* more resilience while loading from the Hub.
* parallelize shard downloads by using snapshot_download()
* default to a shard size.
* more fix
* Empty-Commit
* debug
* fix
* quality
* more debugging
* fix more
* initial comments from Benjamin
* move certain methods to loading_utils
* add test to check if the correct number of shards are present.
* add a test to check if loading of sharded checkpoints from the Hub is okay
* clarify the unit when passed as an int.
* use hf_hub for sharding.
* remove unnecessary code
* remove unnecessary function
* lucain's comments.
* fixes
* address high-level comments.
* fix test
* subfolder shenanigans.
* Update src/diffusers/utils/hub_utils.py
Co-authored-by: Lucain <lucainp@gmail.com >
* Apply suggestions from code review
Co-authored-by: Lucain <lucainp@gmail.com >
* remove _huggingface_hub_version as not needed.
* address more feedback.
* add a test for local_files_only=True
* need hf hub to be at least 0.23.2
* style
* final comment.
* clean up subfolder.
* deal with suffixes in code.
* _add_variant default.
* use weights_name_pattern
* remove add_suffix_keyword
* clean up downloading of sharded ckpts.
* don't return something special when using index.json
* fix more
* don't use bare except
* remove comments and catch the errors better
* fix a couple of things when using is_file()
* empty
---------
Co-authored-by: Lucain <lucainp@gmail.com >
2024-06-07 14:49:10 +05:30
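A minimal sketch of the API this PR adds, assuming `max_shard_size` follows the same size-string convention as in `transformers`:

```python
from diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

# Save the weights split into ~2 GB shards; an index JSON mapping each tensor
# to its shard file is written alongside them.
unet.save_pretrained("sd15-unet-sharded", max_shard_size="2GB")

# Loading resolves the index and reads every shard back in.
reloaded = UNet2DConditionModel.from_pretrained("sd15-unet-sharded")
```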
Sayak Paul
a0542c1917
[LoRA] Remove legacy LoRA code and related adjustments ( #8316 )
...
* remove legacy code from load_attn_procs.
* finish first draft
* fix more.
* fix more
* add test
* add serialization support.
* fix-copies
* require peft backend for lora tests
* style
* fix test
* fix loading.
* empty
* address benjamin's feedback.
2024-06-05 08:15:30 +04:00
Sayak Paul
983dec3bf7
[Core] Introduce class variants for Transformer2DModel ( #7647 )
...
* init for patches
* finish patched model.
* continuous transformer
* vectorized transformer2d.
* style.
* inits.
* fix-copies.
* introduce DiTTransformer2DModel.
* fixes
* use REMAPPING as suggested by @DN6
* better logging.
* add pixart transformer model.
* inits.
* caption_channels.
* attention masking.
* fix use_additional_conditions.
* remove print.
* debug
* flatten
* fix: assertion for sigma
* handle remapping for modeling_utils
* add tests for dit transformer2d
* quality
* placeholder for pixart tests
* pixart tests
* add _no_split_modules
* add docs.
* check
* check
* check
* check
* fix tests
* fix tests
* move Transformer output to modeling_output
* move errors better and bring back use_additional_conditions attribute.
* add unnecessary things from DiT.
* clean up pixart
* fix remapping
* fix device_map things in pixart2d.
* replace Transformer2DModel with appropriate classes in dit, pixart tests
* empty
* legacy mixin classes.
* use a remapping dict for fetching class names.
* change to specifc model types in the pipeline implementations.
* move _fetch_remapped_cls_from_config to modeling_loading_utils.py
* fix dependency problems.
* add deprecation note.
2024-05-31 13:40:27 +05:30
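A hedged sketch of the remapping described above: checkpoints saved as the catch-all `Transformer2DModel` are resolved to the specialized classes on load (the repo id is taken from the DiT pipeline and treated here as an assumption):

```python
from diffusers import DiTTransformer2DModel

# The transformer subfolder of a DiT checkpoint is remapped from the legacy
# Transformer2DModel config to the dedicated DiT class when loaded.
transformer = DiTTransformer2DModel.from_pretrained(
    "facebook/DiT-XL-2-256", subfolder="transformer"
)
print(type(transformer).__name__)  # DiTTransformer2DModel
```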
Sayak Paul
ba1bfac20b
[Core] Refactor IPAdapterPlusImageProjection a bit ( #7994 )
...
* use IPAdapterPlusImageProjectionBlock in IPAdapterPlusImageProjection
* reposition IPAdapterPlusImageProjection
* refactor complete?
* fix heads param retrieval.
* update test dict creation method.
2024-05-29 06:30:47 +05:30
Dhruv Nair
baab065679
Remove unnecessary single file tests for SD Cascade UNet ( #7996 )
...
update
2024-05-22 12:29:59 +05:30
Isamu Isozaki
d27e996ccd
Adding VQGAN Training script ( #5483 )
...
* Init commit
* Removed einops
* Added default movq config for training
* Update explanation of prompts
* Fixed inheritance of discriminator and init_tracker
* Fixed incompatible api between muse and here
* Fixed output
* Setup init training
* Basic structure done
* Removed attention for quick tests
* Style fixes
* Fixed vae/vqgan styles
* Removed redefinition of wandb
* Fixed log_validation and tqdm
* Nothing commit
* Added commit loss to lookup_from_codebook
* Update src/diffusers/models/vq_model.py
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* Adding preliminary README
* Fixed one typo
* Local changes
* Fixed main issues
* Merging
* Update src/diffusers/models/vq_model.py
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* Testing+Fixed bugs in training script
* Some style fixes
* Added wandb to docs
* Fixed timm test
* get testing suite ready.
* remove return loss
* remove return_loss
* Remove diffs
* Remove diffs
* fix ruff format
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com >
2024-05-15 08:47:12 +05:30
Dhruv Nair
cb0f3b49cb
[Refactor] Better align from_single_file logic with from_pretrained ( #7496 )
...
* refactor unet single file loading a bit.
* retrieve the unet from create_diffusers_unet_model_from_ldm
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* tests
* update
* update
* update
* Update docs/source/en/api/single_file.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* Update docs/source/en/api/single_file.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* Update docs/source/en/api/loaders/single_file.md
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update src/diffusers/loaders/single_file.py
Co-authored-by: YiYi Xu <yixu310@gmail.com >
* Update docs/source/en/api/loaders/single_file.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* Update docs/source/en/api/loaders/single_file.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* Update docs/source/en/api/loaders/single_file.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* Update docs/source/en/api/loaders/single_file.md
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
* update
---------
Co-authored-by: sayakpaul <spsayakpaul@gmail.com >
Co-authored-by: YiYi Xu <yixu310@gmail.com >
2024-05-09 19:00:19 +05:30
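A sketch of the `from_single_file` path this refactor brings closer to `from_pretrained`; the checkpoint URL is illustrative:

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Build the full pipeline from a single .safetensors checkpoint; component
# configs are resolved the same way from_pretrained would resolve them.
pipe = StableDiffusionXLPipeline.from_single_file(
    "https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0/blob/main/sd_xl_base_1.0.safetensors",
    torch_dtype=torch.float16,
).to("cuda")
```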
HelloWorldBeginner
58237364b1
Add Ascend NPU support for SDXL fine-tuning and fix the model saving bug when using DeepSpeed. ( #7816 )
...
* Add Ascend NPU support for SDXL fine-tuning and fix the model saving bug when using DeepSpeed.
* fix check code quality
* Decouple the NPU flash attention and make it an independent module.
* add doc and unit tests for npu flash attention.
---------
Co-authored-by: mhh001 <mahonghao1@huawei.com >
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-05-03 08:14:34 -10:00
Sayak Paul
8909ab4b19
[Tests] fix: device map tests for models ( #7825 )
...
* fix: device module tests
* remove patch file
* Empty-Commit
2024-05-01 18:45:47 +05:30
Sayak Paul
3fd31eef51
[Core] introduce _no_split_modules to ModelMixin ( #6396 )
...
* introduce _no_split_modules.
* unnecessary spaces.
* remove unnecessary kwargs and style
* fix: accelerate imports.
* change to _determine_device_map
* add the blocks that have residual connections.
* add: CrossAttnUpBlock2D
* add: testing
* style
* line-spaces
* quality
* add disk offload test without safetensors.
* checking disk offloading percentages.
* change model split
* add: utility for checking multi-gpu requirement.
* model parallelism test
* splits.
* splits.
* splits
* splits.
* splits.
* splits.
* offload folder to test_disk_offload_with_safetensors
* add _no_split_modules
* fix-copies
2024-04-30 08:46:51 +05:30
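With `_no_split_modules` declared on a model class, device-map loading can keep residual-connected blocks on a single device. A hedged sketch follows; the "auto" strategy and the `hf_device_map` attribute are assumptions carried over from the `accelerate`/`transformers` convention:

```python
from diffusers import UNet2DConditionModel

# Requires accelerate; blocks listed in _no_split_modules are never split
# across devices when the placement is computed.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    subfolder="unet",
    device_map="auto",
)
print(unet.hf_device_map)  # assumed attribute exposing the final placement
```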
Sayak Paul
b833d0fc80
[Tests] mark UNetControlNetXSModelTests::test_forward_no_control to be flaky ( #7771 )
...
decorate UNetControlNetXSModelTests::test_forward_no_control with is_flaky
2024-04-25 07:29:04 +05:30
Dhruv Nair
9ef43f38d4
Fix test for consistency decoder. ( #7746 )
...
update
2024-04-24 12:28:11 +05:30
YiYi Xu
e5674015f3
adding back test_conversion_when_using_device_map ( #7704 )
...
* style
* Fix device map nits (#7705 )
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-04-18 19:21:32 -10:00
Fabio Rigano
b5c8b555d7
Move IP Adapter Face ID to core ( #7186 )
...
* Switch to peft and multi proj layers
* Move Face ID loading and inference to core
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-04-18 14:13:27 -10:00
UmerHA
fda1531d8a
Fixing implementation of ControlNet-XS ( #6772 )
...
* CheckIn - created DownSubBlocks
* Added extra channels, implemented subblock fwd
* Fixed connection sizes
* checkin
* Removed iter, next in forward
* Models for SD21 & SDXL run through
* Added back pipelines, cleared up connections
* Cleaned up connection creation
* added debug logs
* updated logs
* logs: added input loading
* Update umer_debug_logger.py
* log: Loading hint
* Update umer_debug_logger.py
* added logs
* Changed debug logging
* debug: added more logs
* Fixed num_norm_groups
* Debug: Logging all of SDXL input
* Update umer_debug_logger.py
* debug: updated logs
* checkin
* Readded tests
* Removed debug logs
* Fixed Slow Tests
* Added value checks | Updated model_cpu_offload_seq
* accelerate-offloading works ; fast tests work
* Made unet & addon explicit in controlnet
* Updated slow tests
* Added dtype/device to ControlNetXS
* Filled in test model paths
* Added image_encoder/feature_extractor to XL pipe
* Fixed fast tests
* Added comments and docstrings
* Fixed copies
* Added docs ; Updates slow tests
* Moved changes to UNetMidBlock2DCrossAttn
* tiny cleanups
* Removed stray prints
* Removed ip adapters + freeU
- Removed ip adapters + freeU as they don't make sense for ControlNet-XS
- Fixed imports of UNet components
* Fixed test_save_load_float16
* Make style, quality, fix-copies
* Changed loading/saving API for ControlNetXS
- Changed loading/saving API for ControlNetXS
- other small fixes
* Removed ControlNet-XS from research examples
* Make style, quality, fix-copies
* Small fixes
- deleted ControlNetXSModel.init_original
- added time_embedding_mix to StableDiffusionControlNetXSPipeline.from_pretrained / StableDiffusionXLControlNetXSPipeline.from_pretrained
- fixed copy hints
* checkin May 11 '23
* CheckIn Mar 12 '24
* Fixed tests for SD
* Added tests for UNetControlNetXSModel
* Fixed SDXL tests
* cleanup
* Delete Pipfile
* CheckIn Mar 20
Started replacing sub blocks by `ControlNetXSCrossAttnDownBlock2D` and `ControlNetXSCrossAttnUpBlock2D`
* check-in Mar 23
* checkin 24 Mar
* Created init for UNetCnxs and CnxsAddon
* CheckIn
* Made from_modules, from_unet and no_control work
* make style,quality,fix-copies & small changes
* Fixed freezing
* Added gradient ckpt'ing; fixed tests
* Fix slow tests(+compile) ; clear naming confusion
* Don't create UNet in init ; removed class_emb
* Incorporated review feedback
- Deleted get_base_pipeline / get_controlnet_addon for pipes
- Pipes inherit from StableDiffusionXLPipeline
- Made module dicts for cnxs-addon's down/mid/up classes
- Added support for qkv fusion and freeU
* Make style, quality, fix-copies
* Implemented review feedback
* Removed compatibility check for vae/ctrl embedding
* make style, quality, fix-copies
* Delete Pipfile
* Integrated review feedback
- Importing ControlNetConditioningEmbedding now
- get_down/mid/up_block_addon now outside class
- renamed `do_control` to `apply_control`
* Reduced size of test tensors
For this, added `norm_num_groups` as parameter everywhere
* Renamed cnxs-`Addon` to cnxs-`Adapter`
- `ControlNetXSAddon` -> `ControlNetXSAdapter`
- `ControlNetXSAddonDownBlockComponents` -> `DownBlockControlNetXSAdapter`, and similarly for mid/up
- `get_mid_block_addon` -> `get_mid_block_adapter`, and similarly for mid/up
* Fixed save_pretrained/from_pretrained bug
* Removed redundant code
---------
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com >
2024-04-16 21:56:20 +05:30
YiYi Xu
a341b536a8
disable test_conversion_when_using_device_map ( #7620 )
...
* disable test
* update
---------
Co-authored-by: yiyixuxu <yixu310@gmail.com>
2024-04-09 09:01:19 -10:00
Sayak Paul
1c60e094de
[Tests] reduce block sizes of UNet and VAE tests ( #7560 )
...
* reduce block sizes for unet1d.
* reduce blocks for unet_2d.
* reduce block size for unet_motion
* increase channels.
* correctly increase channels.
* reduce number of layers in unet2dconditionmodel tests.
* reduce block sizes for unet2dconditionmodel tests
* reduce block sizes for unet3dconditionmodel.
* fix: test_feed_forward_chunking
* fix: test_forward_with_norm_groups
* skip spatiotemporal tests on MPS.
* reduce block size in AutoencoderKL.
* reduce block sizes for vqmodel.
* further reduce block size.
* make style.
* Empty-Commit
* reduce sizes for ConsistencyDecoderVAETests
* further reduction.
* further block reductions in AutoencoderKL and AsymmetricAutoencoderKL.
* massively reduce the block size in unet2dcontionmodel.
* reduce sizes for unet3d
* fix tests in unet3d.
* reduce blocks further in motion unet.
* fix: output shape
* add attention_head_dim to the test configuration.
* remove unexpected keyword arg
* up a bit.
* groups.
* up again
* fix
2024-04-05 10:08:32 +05:30
Dhruv Nair
4d39b7483d
Memory clean up on all Slow Tests ( #7514 )
...
* update
* update
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-03-29 14:23:28 +05:30
YiYi Xu
34c90dbb31
fix OOM for test_vae_tiling ( #7510 )
...
use float16 and add torch.no_grad()
2024-03-29 08:22:39 +05:30
M. Tolga Cangöz
443aa14e41
Fix Tiling in ConsistencyDecoderVAE ( #7290 )
...
* Fix typos
* Add docstring to `decode` method in `ConsistencyDecoderVAE`
* Fix tiling
* Enable tiled VAE decoding with customizable tile sample size and overlap factor
* Revert "Enable tiled VAE decoding with customizable tile sample size and overlap factor"
This reverts commit 181049675e.
* Add VAE tiling test for `ConsistencyDecoderVAE`
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-03-26 17:59:08 +05:30
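A short sketch of what the tiling fix enables, assuming the standard `enable_tiling()` toggle and the `openai/consistency-decoder` checkpoint:

```python
import torch
from diffusers import ConsistencyDecoderVAE

vae = ConsistencyDecoderVAE.from_pretrained(
    "openai/consistency-decoder", torch_dtype=torch.float16
).to("cuda")
vae.enable_tiling()  # decode large latents tile by tile to bound memory use

latents = torch.randn(1, 4, 128, 128, dtype=torch.float16, device="cuda")
with torch.no_grad():
    image = vae.decode(latents).sample
```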
Sayak Paul
484c8ef399
[tests] skip dynamo tests when python is 3.12. ( #7458 )
...
skip dynamo tests when python is 3.12.
2024-03-26 08:39:48 +05:30
M. Tolga Cangöz
a51b6cc86a
[Docs] Fix typos ( #7451 )
...
* Fix typos
* Fix typos
* Fix typos
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-03-25 11:48:02 -07:00
M. Tolga Cangöz
e97a633b63
Update access of configuration attributes ( #7343 )
...
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-03-18 08:53:29 -10:00
Dhruv Nair
4974b84564
Update Cascade Tests ( #7324 )
...
* update
* update
* update
2024-03-14 20:51:22 +05:30
Dhruv Nair
41424466e3
[Tests] Fix incorrect constant in VAE scaling test. ( #7301 )
...
update
2024-03-14 10:24:01 +05:30
Dhruv Nair
ed224f94ba
Add single file support for Stable Cascade ( #7274 )
...
* update
* update
* update
* update
* update
* update
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com >
2024-03-13 08:37:31 +05:30