
# xFormers

We recommend using xFormers for both inference and training. In our tests, its optimizations of the attention blocks deliver both faster speed and lower memory consumption.

Install xFormers from pip:

```bash
pip install xformers
```

The xFormers pip package requires the latest version of PyTorch. If you need to use an older version of PyTorch, we recommend installing xFormers from source.
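Since version mismatches between PyTorch and the xFormers wheel are a common source of installation problems, it can help to check which versions are actually installed. Below is a minimal, stdlib-only sketch (the `installed_version` helper is our own, not part of diffusers or xFormers):

```python
from importlib import metadata


def installed_version(package):
    """Return the installed version string of a package, or None if it is not installed."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None


# Report the versions of the two packages the compatibility note above refers to.
for pkg in ("torch", "xformers"):
    version = installed_version(pkg)
    print(f"{pkg}: {version if version else 'not installed'}")
```

If `torch` and `xformers` report incompatible versions, reinstalling xFormers (or building it from source against your PyTorch) is the usual fix.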

After installation, you can call `enable_xformers_memory_efficient_attention()` for faster inference and reduced memory consumption, as shown in this section.
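A small sketch of how this might be wired up: the wrapper below only calls `enable_xformers_memory_efficient_attention()` when the xFormers package can actually be imported, so the same script still runs on machines without it. The helper name `enable_xformers_if_available` is our own; the commented pipeline usage is illustrative and requires a GPU plus a model download.

```python
def enable_xformers_if_available(pipe):
    """Enable xFormers memory-efficient attention on a diffusers pipeline
    if the xformers package is installed; return whether it was enabled."""
    try:
        import xformers  # noqa: F401
    except ImportError:
        return False
    pipe.enable_xformers_memory_efficient_attention()
    return True


# Illustrative usage with a diffusers pipeline (commented out: needs a GPU and a model download):
# import torch
# from diffusers import DiffusionPipeline
# pipe = DiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5",
#                                          torch_dtype=torch.float16).to("cuda")
# enable_xformers_if_available(pipe)
```

Guarding the call this way keeps scripts portable; on hosts without xFormers the pipeline simply falls back to the default attention implementation.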

According to this issue, xFormers v0.0.16 cannot be used for training (fine-tuning or DreamBooth) on some GPUs. If you observe this problem, please install a development version as indicated in the issue comments.