Mention training problems with xFormers 0.0.16 (#2254)
@@ -27,3 +27,9 @@ The xFormers PIP package requires the latest version of PyTorch (1.13.1 as of xF
 </Tip>
 
 After xFormers is installed, you can use `enable_xformers_memory_efficient_attention()` for faster inference and reduced memory consumption, as discussed [here](fp16#memory-efficient-attention).
+
+<Tip warning={true}>
+
+According to [this issue](https://github.com/huggingface/diffusers/issues/2234#issuecomment-1416931212), xFormers `v0.0.16` cannot be used for training (fine-tuning or Dreambooth) on some GPUs. If you observe that problem, please install a development version as indicated in that comment.
+
+</Tip>
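For reference, a minimal sketch of how the inference-side call mentioned in this hunk is typically used on a diffusers pipeline; the checkpoint name below is only an illustrative assumption, not part of the commit:

```python
import torch
from diffusers import StableDiffusionPipeline

# Example checkpoint only; any Stable Diffusion checkpoint works the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Requires a working xFormers installation; this call raises if xFormers is unavailable.
pipe.enable_xformers_memory_efficient_attention()

image = pipe("a photo of an astronaut riding a horse").images[0]
```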
@@ -235,5 +235,12 @@ python train_text_to_image_flax.py \
   --output_dir="sd-pokemon-model"
 ```
 
-### Training with xformers:
-You can enable memory efficient attention by [installing xFormers](https://github.com/facebookresearch/xformers#installing-xformers) and padding the `--enable_xformers_memory_efficient_attention` argument to the script. This is not available with the Flax/JAX implementation.
+### Training with xFormers:
+
+You can enable memory efficient attention by [installing xFormers](https://huggingface.co/docs/diffusers/main/en/optimization/xformers) and passing the `--enable_xformers_memory_efficient_attention` argument to the script.
+
+xFormers training is not available for Flax/JAX.
+
+**Note**:
+
+According to [this issue](https://github.com/huggingface/diffusers/issues/2234#issuecomment-1416931212), xFormers `v0.0.16` cannot be used for training on some GPUs. If you observe that problem, please install a development version as indicated in that comment.
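For reference, a minimal sketch of roughly what the `--enable_xformers_memory_efficient_attention` flag amounts to inside the PyTorch training script: the UNet's attention layers are switched to xFormers attention when the library is importable. This is a simplified illustration, not the script's exact code, and the checkpoint id is only an assumption:

```python
from diffusers import UNet2DConditionModel
from diffusers.utils.import_utils import is_xformers_available

# Example checkpoint only; the training script loads the UNet the user points it at.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

if is_xformers_available():
    # Switch the UNet's attention blocks to xFormers memory-efficient attention.
    unet.enable_xformers_memory_efficient_attention()
else:
    raise ValueError("xformers is not available. Make sure it is installed correctly.")
```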