Will Berman
2fd46405cd
consistency decoder ( #5694 )
* consistency decoder
* rename
* Apply suggestions from code review
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
* Update src/diffusers/pipelines/consistency_models/pipeline_consistency_models.py
* up
* Apply suggestions from code review
* up
* up
* up
---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
2023-11-09 12:21:41 +01:00
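The consistency decoder commit above adds ConsistencyDecoderVAE, a drop-in replacement for the standard Stable Diffusion VAE decoder. A minimal usage sketch, assuming the openai/consistency-decoder checkpoint id and an fp16-capable GPU:

```python
import torch
from diffusers import ConsistencyDecoderVAE, StableDiffusionPipeline

# Load the consistency decoder and pass it in place of the default VAE.
vae = ConsistencyDecoderVAE.from_pretrained(
    "openai/consistency-decoder", torch_dtype=torch.float16
)
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", vae=vae, torch_dtype=torch.float16
).to("cuda")

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("astronaut.png")
```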
Dhruv Nair
2a8cf8e39f
AnimateDiff Proposal ( #5413 )
* draft design
* clean up
* clean up
* clean up
* clean up
* clean up
* clean up
* clean up
* clean up
* clean up
* update pipeline
* clean up
* clean up
* clean up
* add tests
* change motion block
* clean up
* clean up
* clean up
* update
* update
* update
* update
* update
* update
* update
* update
* clean up
* update
* update
* update model test
* update
* update
* update
* update
* make style
* update
* fix embeddings
* update
* merge upstream
* make fix copies
* fix bug
* fix mistake
* add docs
* update
* clean up
* update
* clean up
* clean up
* fix docstrings
* fix docstrings
* update
* update
* clean up
* update
2023-11-02 15:04:03 +01:00
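The AnimateDiff commit above introduces a MotionAdapter that injects temporal motion layers into a Stable Diffusion 1.5 UNet, driven by a new AnimateDiffPipeline. A minimal sketch, assuming the guoyww/animatediff-motion-adapter-v1-5-2 adapter weights:

```python
import torch
from diffusers import AnimateDiffPipeline, DDIMScheduler, MotionAdapter
from diffusers.utils import export_to_gif

# The motion module is loaded separately and wrapped around the base UNet.
adapter = MotionAdapter.from_pretrained(
    "guoyww/animatediff-motion-adapter-v1-5-2", torch_dtype=torch.float16
)
pipe = AnimateDiffPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", motion_adapter=adapter, torch_dtype=torch.float16
).to("cuda")
pipe.scheduler = DDIMScheduler.from_config(
    pipe.scheduler.config, clip_sample=False, beta_schedule="linear"
)

output = pipe("a rocket launching into space, high detail", num_frames=16)
export_to_gif(output.frames[0], "rocket.gif")
```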
Chengxi Guo
dcbfe662ef
fix typo ( #5505 )
Signed-off-by: mymusise <mymusise1@gmail.com>
2023-10-24 17:14:05 -07:00
Steven Liu
4ff7264d9b
[docs] PushToHubMixin ( #4622 )
* push to hub docs
* fix typo
* feedback
* make style
2023-08-16 13:20:59 -06:00
Sayak Paul
15782fd506
[Pipeline utils] feat: implement push_to_hub for standalone models, schedulers as well as pipelines ( #4128 )
* feat: implement push_to_hub for standalone models.
* address PR feedback.
* Apply suggestions from code review
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
* remove max_shard_size.
* add: support for scheduler push_to_hub
* enable push_to_hub support for flax schedulers.
* enable push_to_hub for pipelines.
* Apply suggestions from code review
Co-authored-by: Lucain <lucainp@gmail.com>
* reflect pr feedback.
* address another round of feedback.
* better handling of kwargs.
* add: tests
* Apply suggestions from code review
Co-authored-by: Lucain <lucainp@gmail.com>
* setting hub staging to False for now.
* incorporate staging test as a separate job.
Co-authored-by: ydshieh <2521628+ydshieh@users.noreply.github.com>
* fix: tokenizer loading.
* fix: json dumping.
* move is_staging_test to a better location.
* better treatment to tokens.
* define repo_id to better handle concurrency
* style
* explicitly set token
* Empty-Commit
* move USER, TOKEN to test
* collate org_repo_id
* delete repo
---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Lucain <lucainp@gmail.com>
Co-authored-by: ydshieh <2521628+ydshieh@users.noreply.github.com>
2023-08-15 07:39:22 +05:30
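The commit above gives models, schedulers, and pipelines a shared push_to_hub method via PushToHubMixin. A minimal sketch, assuming you are already authenticated against the Hub; the repo ids are hypothetical placeholders:

```python
from diffusers import DDIMScheduler, DiffusionPipeline, UNet2DConditionModel

# Schedulers, standalone models, and whole pipelines all expose the same API.
scheduler = DDIMScheduler()
scheduler.push_to_hub("your-username/my-ddim-scheduler")  # hypothetical repo id

unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)
unet.push_to_hub("your-username/sd15-unet", private=True)  # hypothetical repo id

pipe = DiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe.push_to_hub("your-username/my-sd-pipeline")  # hypothetical repo id
```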
Sayak Paul
18fc40c169
[Feat] add tiny Autoencoder for (almost) instant decoding ( #4384 )
* add: model implementation of tiny autoencoder.
* add: inits.
* push the latest devs.
* add: conversion script and finish.
* add: scaling factor args.
* debugging
* fix denormalization.
* fix: positional argument.
* handle use_torch_2_0_or_xformers.
* handle post_quant_conv
* handle dtype
* fix: sdxl image processor for tiny ae.
* fix: sdxl image processor for tiny ae.
* unify upcasting logic.
* copied from madness.
* remove trailing whitespace.
* set is_tiny_vae = False
* address PR comments.
* change to AutoencoderTiny
* make act_fn an str throughout
* fix: apply_forward_hook decorator call
* get rid of the special is_tiny_vae flag.
* directly scale the output.
* fix dummies?
* fix: act_fn.
* get rid of the Clamp() layer.
* bring back copied from.
* movement of the blocks to appropriate modules.
* add: docstrings to AutoencoderTiny
* add: documentation.
* changes to the conversion script.
* add doc entry.
* settle tests.
* style
* add one slow test.
* fix
* fix 2
* fix 2
* fix: 4
* fix: 5
* finish integration tests
* Apply suggestions from code review
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
* style
---------
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
2023-08-02 23:58:05 +05:30
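The commit above adds AutoencoderTiny, a distilled VAE that trades a little reconstruction quality for much faster decoding. A minimal sketch, assuming the madebyollin/taesd weights for Stable Diffusion 1.5:

```python
import torch
from diffusers import AutoencoderTiny, StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
# Swap the full VAE for the tiny autoencoder; latent decoding becomes almost free.
pipe.vae = AutoencoderTiny.from_pretrained("madebyollin/taesd", torch_dtype=torch.float16)
pipe = pipe.to("cuda")

image = pipe("a cozy cabin in the woods, watercolor").images[0]
image.save("cabin.png")
```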
camenduru
c6ae9b7df6
Where did this 'x' come from, Elon? ( #4277 )
* why mdx?
* why mdx?
* why mdx?
* no x for kandinsky either
---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2023-07-26 18:18:14 +02:00
Ruslan Vorovchenko
07f1fbb18e
Asymmetric VQGAN ( #3956 )
* added AsymmetricAutoencoderKL
* fixed copies+dummy
* added script to convert original asymmetric vqgan
* added docs
* updated docs
* fixed style
* fixes, added tests
* update doc
* fixed doc
* fixed tests
* naming
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
* naming
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
* updated code example
* updated doc
* comments fixes
* added docstring
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
* comments fixes
* added inpaint pipeline tests
* comment suggestion: delete method
* yet another round of fixes
---------
Co-authored-by: Ruslan Vorovchenko <r.vorovchenko@prequelapp.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
2023-07-20 17:51:06 +02:00
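The commit above adds AsymmetricAutoencoderKL, an asymmetric VQGAN-style VAE whose larger decoder also conditions on the masked input, which helps preserve unmasked regions during inpainting. A minimal sketch, assuming the cross-attention/asymmetric-autoencoder-kl-x-1-5 checkpoint; the image inputs here are blank placeholders:

```python
import torch
from PIL import Image
from diffusers import AsymmetricAutoencoderKL, StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
)
# Replace the stock VAE with the asymmetric one before running inpainting.
pipe.vae = AsymmetricAutoencoderKL.from_pretrained(
    "cross-attention/asymmetric-autoencoder-kl-x-1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# Placeholder image and mask; in practice, load a real photo and its inpainting mask.
init_image = Image.new("RGB", (512, 512), "white")
mask_image = Image.new("L", (512, 512), 255)

image = pipe(prompt="a red sofa", image=init_image, mask_image=mask_image).images[0]
image.save("inpainted.png")
```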
Patrick von Platen
6b1abba18d
Add controlnet and vae from single file ( #4084 )
* Add controlnet from single file
* Updates
* make style
* finish
* Apply suggestions from code review
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
2023-07-19 14:50:27 +02:00
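The commit above extends single-file loading to ControlNetModel and AutoencoderKL, so original-format .ckpt / .safetensors checkpoints can be used without conversion. A minimal sketch with hypothetical local file paths:

```python
from diffusers import AutoencoderKL, ControlNetModel, StableDiffusionControlNetPipeline

# The paths below are hypothetical; point them at original single-file checkpoints.
controlnet = ControlNetModel.from_single_file("./control_v11p_sd15_canny.safetensors")
vae = AutoencoderKL.from_single_file("./vae-ft-mse-840000-ema-pruned.ckpt")

pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, vae=vae
)
```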
Steven Liu
174dcd697f
[docs] Model API ( #3562 )
* add modelmixin and unets
* remove old model page
* minor fixes
* fix unet2dcondition
* add vqmodel and autoencoderkl
* add rest of models
* fix autoencoderkl path
* fix toctree
* fix toctree again
* apply feedback
* apply feedback
* fix copies
* fix controlnet copy
* fix copies
2023-06-29 17:24:39 -07:00
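The docs commit above reorganizes the model API pages around ModelMixin, the UNets, VQModel, and AutoencoderKL. For reference, a minimal sketch of the loading pattern those pages document, assuming the standard Stable Diffusion 1.5 repo layout:

```python
from diffusers import AutoencoderKL, UNet2DConditionModel

# Every class on these pages inherits from ModelMixin and loads with from_pretrained.
vae = AutoencoderKL.from_pretrained("runwayml/stable-diffusion-v1-5", subfolder="vae")
unet = UNet2DConditionModel.from_pretrained("runwayml/stable-diffusion-v1-5", subfolder="unet")
print(vae.config.latent_channels, unet.config.cross_attention_dim)
```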