
Fix typos and add Typo check GitHub Action (#483)

* Fix typos

* Add a typo check action

* Fix a bug

* Changed to manual typo check currently

Ref: https://github.com/huggingface/diffusers/pull/483#pullrequestreview-1104468010

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>

* Removed a confusing message

* Renamed "nin_shortcut" to "in_shortcut"

* Add memo about NIN

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>
Author: Yuta Hayashibe
Date: 2022-09-16 22:36:51 +09:00
Committed by: GitHub
Parent: c0493723f7
Commit: 76d492ea49

38 changed files with 92 additions and 66 deletions


@@ -44,7 +44,7 @@ To this end, the design of schedulers is such that:
 The core API for any new scheduler must follow a limited structure.
 - Schedulers should provide one or more `def step(...)` functions that should be called to update the generated sample iteratively.
 - Schedulers should provide a `set_timesteps(...)` method that configures the parameters of a schedule function for a specific inference task.
-- Schedulers should be framework-agonstic, but provide a simple functionality to convert the scheduler into a specific framework, such as PyTorch
+- Schedulers should be framework-agnostic, but provide a simple functionality to convert the scheduler into a specific framework, such as PyTorch
 with a `set_format(...)` method.
 The base class [`SchedulerMixin`] implements low level utilities used by multiple schedulers.
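
For context, the API described in the hunk above can be exercised roughly as follows. This snippet is illustrative only (it is not part of this commit): it uses `DDPMScheduler` with its default config and substitutes random noise for a real model's prediction, and exact signatures may differ slightly across diffusers versions.

```python
# Illustrative sketch of the scheduler API described above (not part of this commit).
# A real pipeline would use a trained model's noise prediction instead of the random placeholder.
import torch
from diffusers import DDPMScheduler

scheduler = DDPMScheduler()          # default config; any scheduler exposing the same API works
scheduler.set_timesteps(50)          # configure the schedule for this inference task

sample = torch.randn(1, 3, 32, 32)   # start from Gaussian noise

for t in scheduler.timesteps:
    noise_pred = torch.randn_like(sample)                        # placeholder for model(sample, t)
    sample = scheduler.step(noise_pred, t, sample).prev_sample   # SchedulerOutput.prev_sample
```

Because the per-step update lives entirely in the scheduler, swapping samplers amounts to swapping the scheduler object while the rest of the loop stays unchanged.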
@@ -53,7 +53,7 @@ The base class [`SchedulerMixin`] implements low level utilities used by multipl
 [[autodoc]] SchedulerMixin
 ### SchedulerOutput
-The class [`SchedulerOutput`] contains the ouputs from any schedulers `step(...)` call.
+The class [`SchedulerOutput`] contains the outputs from any schedulers `step(...)` call.
 [[autodoc]] schedulers.scheduling_utils.SchedulerOutput
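
A hedged illustration of the `SchedulerOutput` mentioned above: the object returned by `step(...)` is a small dataclass-style container, and its field names (typically `prev_sample`) can vary by scheduler subclass and library version.

```python
# Sketch: inspecting the container returned by step(...) (illustrative, not from this commit).
import dataclasses
import torch
from diffusers import DDPMScheduler

scheduler = DDPMScheduler()
scheduler.set_timesteps(10)
sample = torch.randn(1, 3, 32, 32)
t = scheduler.timesteps[0]

out = scheduler.step(torch.randn_like(sample), t, sample)
print(type(out).__name__)                          # a SchedulerOutput subclass, e.g. DDPMSchedulerOutput
print([f.name for f in dataclasses.fields(out)])   # typically includes "prev_sample"
sample = out.prev_sample                            # the sample to pass into the next step
```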
@@ -71,7 +71,7 @@ Original paper can be found [here](https://arxiv.org/abs/2010.02502).
 [[autodoc]] DDPMScheduler
-#### Varience exploding, stochastic sampling from Karras et. al
+#### Variance exploding, stochastic sampling from Karras et. al
 Original paper can be found [here](https://arxiv.org/abs/2006.11239).