From 216d19017852e80a38dac4e42eb0759c96810313 Mon Sep 17 00:00:00 2001
From: Sayak Paul
Date: Mon, 16 Jan 2023 09:16:54 +0100
Subject: [PATCH] Update README.md to include our blog post (#1998)

* Update README.md

Co-authored-by: Pedro Cuenca
---
 examples/dreambooth/README.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/examples/dreambooth/README.md b/examples/dreambooth/README.md
index 2858c04c48..93e6f4d14b 100644
--- a/examples/dreambooth/README.md
+++ b/examples/dreambooth/README.md
@@ -321,3 +321,6 @@ python train_dreambooth_flax.py \
 You can enable memory efficient attention by [installing xFormers](https://github.com/facebookresearch/xformers#installing-xformers) and passing the `--enable_xformers_memory_efficient_attention` argument to the script. This is not available with the Flax/JAX implementation.
 
 You can also use Dreambooth to train the specialized in-painting model. See [the script in the research folder for details](https://github.com/huggingface/diffusers/tree/main/examples/research_projects/dreambooth_inpaint).
+
+### Experimental results
+You can refer to [this blog post](https://huggingface.co/blog/dreambooth), which discusses some of our DreamBooth experiments in detail. Specifically, it recommends a set of DreamBooth-specific tips and tricks that we have found to work well for a variety of subjects.
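
For context on the xFormers note in the patched README text above, a minimal sketch of a training invocation with the memory-efficient attention flag might look like the following. The model name, data directory, prompt, and output path are placeholder assumptions for illustration, not values taken from the patch; the flag applies to the PyTorch script, since the README states it is not available with the Flax/JAX implementation.

```bash
# Illustrative sketch only: paths, model name, and prompt below are placeholder
# assumptions. The xFormers flag is passed alongside the usual DreamBooth arguments.
accelerate launch train_dreambooth.py \
  --pretrained_model_name_or_path="runwayml/stable-diffusion-v1-5" \
  --instance_data_dir="./instance_images" \
  --instance_prompt="a photo of sks dog" \
  --output_dir="./dreambooth-output" \
  --resolution=512 \
  --train_batch_size=1 \
  --enable_xformers_memory_efficient_attention
```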