# Attention Processor

An attention processor is a class for applying different types of attention mechanisms.

## AttnProcessor

[[autodoc]] models.attention_processor.AttnProcessor

## AttnProcessor2_0

[[autodoc]] models.attention_processor.AttnProcessor2_0

## AttnAddedKVProcessor

[[autodoc]] models.attention_processor.AttnAddedKVProcessor

## AttnAddedKVProcessor2_0

[[autodoc]] models.attention_processor.AttnAddedKVProcessor2_0

## CrossFrameAttnProcessor

[[autodoc]] pipelines.text_to_video_synthesis.pipeline_text_to_video_zero.CrossFrameAttnProcessor

## CustomDiffusionAttnProcessor

[[autodoc]] models.attention_processor.CustomDiffusionAttnProcessor

## CustomDiffusionAttnProcessor2_0

[[autodoc]] models.attention_processor.CustomDiffusionAttnProcessor2_0

## CustomDiffusionXFormersAttnProcessor

[[autodoc]] models.attention_processor.CustomDiffusionXFormersAttnProcessor

## FusedAttnProcessor2_0

[[autodoc]] models.attention_processor.FusedAttnProcessor2_0

## LoRAAttnAddedKVProcessor

[[autodoc]] models.attention_processor.LoRAAttnAddedKVProcessor

## LoRAXFormersAttnProcessor

[[autodoc]] models.attention_processor.LoRAXFormersAttnProcessor

## SlicedAttnProcessor

[[autodoc]] models.attention_processor.SlicedAttnProcessor

## SlicedAttnAddedKVProcessor

[[autodoc]] models.attention_processor.SlicedAttnAddedKVProcessor

## XFormersAttnProcessor

[[autodoc]] models.attention_processor.XFormersAttnProcessor

## AttnProcessorNPU

[[autodoc]] models.attention_processor.AttnProcessorNPU