# xFormers

We recommend xFormers for both inference and training. In our tests, the optimizations performed in the attention blocks allow for both faster speed and reduced memory consumption.

Install xFormers from `pip`:

```bash
pip install xformers
```

The xFormers `pip` package requires the latest version of PyTorch. If you need to use a previous version of PyTorch, then we recommend installing xFormers from source.

After xFormers is installed, you can call `enable_xformers_memory_efficient_attention()` for faster inference and reduced memory consumption, as shown in this section.
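For example, a minimal sketch of enabling the optimization on a pipeline might look like the following (the checkpoint id and prompt are only placeholders):

```py
import torch
from diffusers import DiffusionPipeline

# Load a pipeline (the checkpoint id here is only an example).
pipeline = DiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Enable xFormers memory-efficient attention for faster, lower-memory inference.
pipeline.enable_xformers_memory_efficient_attention()

image = pipeline("an astronaut riding a horse").images[0]

# The optimization can be turned off again if needed.
pipeline.disable_xformers_memory_efficient_attention()
```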

According to this issue, xFormers v0.0.16 cannot be used for training (fine-tuning or DreamBooth) on some GPUs. If you observe this problem, please install a development version as indicated in the issue comments.