* update
* update
* add coauthor
Co-Authored-By: Dhruv Nair <dhruv.nair@gmail.com>
* improve test
* handle ip adapter params correctly
* fix chroma qkv fusion test
* fix fastercache implementation
* fix more tests
* fight more tests
* add back set_attention_backend
* update
* update
* make style
* make fix-copies
* make ip adapter processor compatible with attention dispatcher
* refactor chroma as well
* remove rmsnorm assert
* minify and deprecate npu/xla processors
* update
* refactor
* refactor; support flash attention 2 with cp
* fix
* support sage attention with cp
* make torch compile compatible
* update
* refactor
* update
* refactor
* refactor
* add ulysses backward
* try to make dreambooth script work; accelerator backward not playing well
* Revert "try to make dreambooth script work; accelerator backward not playing well"
This reverts commit 768d0ea6fa.
* workaround compilation problems with triton when doing all-to-all
* support wan
* handle backward correctly
* support qwen
* support ltx
* make fix-copies
* Update src/diffusers/models/modeling_utils.py
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
* apply review suggestions
* update docs
* add explanation
* make fix-copies
* add docstrings
* support passing parallel_config to from_pretrained
* apply review suggestions
* make style
* update
* Update docs/source/en/api/parallel.md
Co-authored-by: Aryan <aryan@huggingface.co>
* up
---------
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
<!-- Copyright 2025 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License. -->
# Parallelism
Parallelism strategies speed up diffusion transformers by distributing computation across multiple devices, enabling faster inference and training.
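The commit history above adds a `parallel_config` argument to `from_pretrained` and ring/Ulysses context-parallel attention paths. A minimal sketch of how the pieces fit together, launched with `torchrun --nproc_per_node=2`; the Wan model ID, the `ring_degree` value, and the generation settings are illustrative assumptions, not prescriptions:

```python
# Minimal context-parallel inference sketch; run with: torchrun --nproc_per_node=2 cp_demo.py
# Model ID, ring_degree, and generation settings are illustrative assumptions.
import torch
import torch.distributed as dist

from diffusers import AutoModel, ContextParallelConfig, WanPipeline
from diffusers.utils import export_to_video

dist.init_process_group("nccl")
rank = dist.get_rank()
device = torch.device("cuda", rank % torch.cuda.device_count())
torch.cuda.set_device(device)

# Shard attention over the sequence dimension across the two ranks (ring attention).
transformer = AutoModel.from_pretrained(
    "Wan-AI/Wan2.1-T2V-1.3B-Diffusers",
    subfolder="transformer",
    torch_dtype=torch.bfloat16,
    parallel_config=ContextParallelConfig(ring_degree=2),
)
pipeline = WanPipeline.from_pretrained(
    "Wan-AI/Wan2.1-T2V-1.3B-Diffusers", transformer=transformer, torch_dtype=torch.bfloat16
).to(device)

# Every rank runs the same denoising loop; attention inputs are split and
# outputs gathered under the hood by the context-parallel hooks.
video = pipeline("A cat walks on the grass", num_inference_steps=30).frames[0]
if rank == 0:
    export_to_video(video, "output.mp4", fps=16)
dist.destroy_process_group()
```

Per the `add ulysses backward` commit, `ContextParallelConfig` also covers a Ulysses (all-to-all) path; swapping `ring_degree=2` for `ulysses_degree=2` would select it, assuming that parameter name.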
## ParallelConfig
[[autodoc]] ParallelConfig
## ContextParallelConfig
[[autodoc]] ContextParallelConfig
[[autodoc]] hooks.apply_context_parallel
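For a model that does not ship a built-in context-parallel plan, the hook can in principle be applied by hand. A rough sketch with loud caveats: the exact signature of `apply_context_parallel`, the `ParallelConfig(context_parallel_config=...)` field, and the plan entry types are assumptions inferred from the commit history above, not verified API:

```python
# Hypothetical sketch: manually attaching the context-parallel hook.
# The plan keys, ContextParallelInput/ContextParallelOutput fields, and the
# apply_context_parallel signature are assumptions for illustration only.
import torch

from diffusers import AutoModel, ContextParallelConfig, ParallelConfig
from diffusers.hooks import apply_context_parallel
from diffusers.models._modeling_parallel import ContextParallelInput, ContextParallelOutput

transformer = AutoModel.from_pretrained(
    "Wan-AI/Wan2.1-T2V-1.3B-Diffusers", subfolder="transformer", torch_dtype=torch.bfloat16
)

# ASSUMPTION: a plan maps submodule names to metadata describing which tensor
# dims to split on entry and gather on exit; supported models (wan/qwen/ltx
# in the log above) ship such plans built in.
plan = {
    "rope": ContextParallelInput(split_dim=1, expected_dims=3),        # shard sequence inputs
    "proj_out": ContextParallelOutput(gather_dim=1, expected_dims=3),  # regroup final hidden states
}
config = ParallelConfig(context_parallel_config=ContextParallelConfig(ring_degree=2))

apply_context_parallel(transformer, config, plan)
```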