Training examples

Unconditional Flowers

The command to train a DDPM UNet model on the Oxford Flowers dataset:

python -m torch.distributed.launch \
  --nproc_per_node 4 \
  train_unconditional.py \
  --dataset="huggan/flowers-102-categories" \
  --resolution=64 \
  --output_path="flowers-ddpm" \
  --batch_size=16 \
  --num_epochs=100 \
  --gradient_accumulation_steps=1 \
  --lr=1e-4 \
  --warmup_steps=500 \
  --mixed_precision=no

A full training run takes 2 hours on 4xV100 GPUs.
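
Once training finishes, the checkpoint written to --output_path can be loaded for sampling. The snippet below is a minimal sketch, assuming a diffusers release where DDPMPipeline provides from_pretrained and the pipeline call returns generated images:

# Minimal sampling sketch (assumes DDPMPipeline.from_pretrained and .images are available).
from diffusers import DDPMPipeline

# "flowers-ddpm" is the --output_path used in the training command above.
pipeline = DDPMPipeline.from_pretrained("flowers-ddpm")

# Generate one 64x64 sample by running the full reverse diffusion chain.
image = pipeline(batch_size=1).images[0]
image.save("flowers_sample.png")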

Unconditional Pokemon

The command to train a DDPM UNet model on the Pokemon dataset:

python -m torch.distributed.launch \
  --nproc_per_node 4 \
  train_unconditional.py \
  --dataset="huggan/pokemon" \
  --resolution=64 \
  --output_path="pokemon-ddpm" \
  --batch_size=16 \
  --num_epochs=100 \
  --gradient_accumulation_steps=1 \
  --lr=1e-4 \
  --warmup_steps=500 \
  --mixed_precision=no

A full training run takes 2 hours on 4xV100 GPUs.
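
Before launching the run, it can help to look at the dataset that --dataset points to and confirm its images are suitable for the chosen --resolution. The snippet below is a minimal sketch, assuming the huggan/pokemon dataset on the Hub exposes its pictures under an "image" column:

# Quick dataset check (a sketch; the "image" column name is an assumption).
from datasets import load_dataset

dataset = load_dataset("huggan/pokemon", split="train")
print(len(dataset))         # number of training images
print(dataset[0]["image"])  # a PIL image; the training script resizes it to --resolution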