mirror of https://github.com/huggingface/diffusers.git synced 2026-01-29 07:22:12 +03:00

Commit Graph

  • e9ea1c5b2c up sayakpaul 2025-10-06 10:47:12 +05:30
  • b0fc7af941 Merge branch 'main' into fa-hub Sayak Paul 2025-10-06 10:28:00 +05:30
  • ce90f9b2db [FIX] Text to image training peft version (#12434) SahilCarterr 2025-10-06 08:24:54 +05:30
  • c3675d4c9b [core] support QwenImage Edit Plus in modular (#12416) Sayak Paul 2025-10-05 21:57:13 +05:30
  • 2b7deffe36 fix scale_shift_factor being on cpu for wan and ltx (#12347) Vladimir Mandic 2025-10-04 23:53:38 -04:00
  • d53f848720 add transformer pipeline first version leffff 2025-10-04 10:10:23 +00:00
  • 0d3da485a0 up sayakpaul 2025-10-03 21:00:05 +05:30
  • 4f5e9a665e up sayakpaul 2025-10-03 20:49:50 +05:30
  • 23e5559c54 Merge branch 'main' into migrate-lora-pytest Sayak Paul 2025-10-03 20:44:52 +05:30
  • 941ac9c3d9 [training-scripts] Make more examples UV-compatible (follow up on #12000) (#12407) Linoy Tsaban 2025-10-03 17:46:47 +03:00
  • f8f27891c6 up sayakpaul 2025-10-03 20:14:45 +05:30
  • 128535cfcd up sayakpaul 2025-10-03 20:03:50 +05:30
  • bdc9537999 more fixtures. sayakpaul 2025-10-03 20:01:26 +05:30
  • dae161ed26 up sayakpaul 2025-10-03 17:39:55 +05:30
  • c5e9a4a648 Merge branch 'main' into qwen-pipeline-mixin Sayak Paul 2025-10-03 17:21:05 +05:30
  • 6734f5feb8 Merge branch 'main' into unbloat-pipeline-utilities Sayak Paul 2025-10-03 17:19:52 +05:30
  • c4bcf72084 up sayakpaul 2025-10-03 16:56:31 +05:30
  • 99308efb55 update DN6 2025-10-03 16:48:43 +05:30
  • 1737b710a2 up sayakpaul 2025-10-03 16:45:04 +05:30
  • 5015ce4fc7 update DN6 2025-10-03 16:44:23 +05:30
  • 565d674cc4 change flux lora integration tests to use pytest sayakpaul 2025-10-03 16:30:58 +05:30
  • 610842af1a up sayakpaul 2025-10-03 16:14:36 +05:30
  • cba82591e8 up sayakpaul 2025-10-03 15:56:37 +05:30
  • 949cc1c326 up sayakpaul 2025-10-03 14:54:23 +05:30
  • 5ed984cc47 update DN6 2025-10-03 14:42:58 +05:30
  • ec866f5de8 tempfile is now a fixture. sayakpaul 2025-10-03 14:25:54 +05:30
  • 7b4bcce602 up sayakpaul 2025-10-03 14:10:31 +05:30
  • d61bb38fb4 up sayakpaul 2025-10-03 13:14:05 +05:30
  • 9e92f6bb63 up sayakpaul 2025-10-03 12:53:37 +05:30
  • 6c6cade1a7 migrate lora pipeline tests to pytest sayakpaul 2025-10-03 12:52:56 +05:30
  • 77c4e0932c Merge branch 'main' into attn-refactor-blocks DN6 2025-10-03 11:39:22 +05:30
  • fc322ed052 update DN6 2025-10-03 11:38:16 +05:30
  • fed2c46482 update DN6 2025-10-03 11:36:21 +05:30
  • 66320f031a update DN6 2025-10-03 11:35:39 +05:30
  • 474b99597c Merge branch 'main' into fa-hub Sayak Paul 2025-10-03 11:25:37 +05:30
  • 7242b5ff62 FIX Test to ignore warning for enable_lora_hotswap (#12421) Benjamin Bossan 2025-10-02 20:57:11 +02:00
  • b4297967a0 [core] conditionally import torch distributed stuff. (#12420) Sayak Paul 2025-10-02 20:38:02 +05:30
  • 9ae5b6299d [ci] xfail failing tests in CI. (#12418) Sayak Paul 2025-10-02 17:46:15 +05:30
  • 046be83946 up sayakpaul 2025-10-02 15:43:44 +05:30
  • 814d710e56 [tests] cache non lora pipeline outputs. (#12298) Sayak Paul 2025-10-01 09:02:55 +05:30
  • cc5b31ffc9 [docs] Migrate syntax (#12390) Steven Liu 2025-09-30 10:11:19 -07:00
  • d7a1a0363f [docs] CP (#12331) Steven Liu 2025-09-30 09:33:41 -07:00
  • b59654544b Install latest prerelease from huggingface_hub when installing transformers from main (#12395) Lucain 2025-09-30 13:32:33 +02:00
  • 0e12ba7454 fix 3 xpu failures uts w/ latest pytorch (#12408) Yao Matrix 2025-09-30 01:37:48 -07:00
  • 20fd00b14b [Tests] Add single file tester mixin for Models and remove unittest dependency (#12352) Dhruv Nair 2025-09-30 09:58:34 +02:00
  • a957ea1f36 update sf-test-mixin DN6 2025-09-30 12:36:17 +05:30
  • 9938985426 update DN6 2025-09-30 12:28:19 +05:30
  • 6a47dd0c04 update DN6 2025-09-24 18:24:50 +05:30
  • 045f3ade68 Merge branch 'main' into sanitize-pipe-go-tests sanitize-pipe-go-tests Sayak Paul 2025-09-30 08:17:53 +05:30
  • 76d4e416bc [modular]some small fix (#12307) YiYi Xu 2025-09-29 11:42:34 -10:00
  • c07fcf780a [docs] Model formats (#12256) Steven Liu 2025-09-29 11:36:14 -07:00
  • ccedeca96e [docs] Distributed inference (#12285) Steven Liu 2025-09-29 11:24:26 -07:00
  • ae1c1d88ed Merge branch 'main' into maybe-fix-ci maybe-fix-ci Wauplin 2025-09-29 16:55:54 +02:00
  • c50715bbc6 works now Wauplin 2025-09-29 16:52:42 +02:00
  • 64a5187d96 [quantization] feat: support aobaseconfig classes in TorchAOConfig (#12275) Sayak Paul 2025-09-29 18:04:18 +05:30
  • 0a151115bb Fix #12116: preserve boolean dtype for attention masks in ChromaPipeline (#12263) Akshay Babbar 2025-09-29 14:20:05 +05:30
  • 19085ac8f4 Don't skip Qwen model tests for group offloading with disk (#12382) Sayak Paul 2025-09-29 13:08:05 +05:30
  • c1c0e9a481 update DN6 2025-09-29 12:35:55 +05:30
  • 862139d633 Merge branch 'main' into unbloat-pipeline-utilities sayakpaul 2025-09-28 17:29:28 +05:30
  • d29bfb78eb Merge branch 'main' into prompt-isolation-tests-qwen Sayak Paul 2025-09-28 16:48:36 +05:30
  • 290b28354d up sayakpaul 2025-09-28 16:33:56 +05:30
  • 16544bbec3 revert pipeline changes. sayakpaul 2025-09-28 16:23:23 +05:30
  • b26f7fc82f revert lora utils tests. sayakpaul 2025-09-28 16:22:34 +05:30
  • f84b0ab796 up sayakpaul 2025-09-28 16:19:17 +05:30
  • 1185f82450 up sayakpaul 2025-09-28 16:18:35 +05:30
  • a9d50c8f2a up sayakpaul 2025-09-26 22:42:52 +05:30
  • 041501aea9 [docs] remove docstrings from repeated methods in lora_pipeline.py (#12393) Sayak Paul 2025-09-26 22:38:43 +05:30
  • 1662890767 Merge branch 'main' into unbloat-docstrings unbloat-docstrings Sayak Paul 2025-09-26 21:51:56 +05:30
  • 9c0944581a [docs] slight edits to the attention backends docs. (#12394) Sayak Paul 2025-09-26 21:50:16 +05:30
  • 9dc99bb069 Merge branch 'main' into unbloat-docstrings Sayak Paul 2025-09-26 21:24:40 +05:30
  • 39216fc91c lru_cache for Python 3.8 Charles 2025-09-26 17:42:01 +02:00
  • 2ca3cadb35 [perf] Cache version checks Charles 2025-09-26 17:28:55 +02:00
  • f82c1523e5 fix prompt isolation test. sayakpaul 2025-09-26 18:50:26 +05:30
  • 4588bbeb42 [CI] disable installing transformers from main in ci for now. (#12397) Sayak Paul 2025-09-26 18:41:17 +05:30
  • f14dbc011a should be better Wauplin 2025-09-26 12:07:03 +02:00
  • 4767547e91 just bored Wauplin 2025-09-26 11:48:30 +02:00
  • 9522d34bea and now? Wauplin 2025-09-26 11:33:04 +02:00
  • 50befc6312 maybe better Wauplin 2025-09-26 11:14:52 +02:00
  • 8c07129182 maybe better Wauplin 2025-09-26 10:56:06 +02:00
  • fb6aae6476 Allow prerelease when installing transformers from main Wauplin 2025-09-26 10:49:49 +02:00
  • 1b96ed7df3 up sayakpaul 2025-09-26 11:10:00 +05:30
  • d9510862bf load_lora_into_transformer sayakpaul 2025-09-26 09:19:08 +05:30
  • 056fb8ad98 unfuse_lora sayakpaul 2025-09-26 09:12:27 +05:30
  • ca913f0db4 fuse_lora sayakpaul 2025-09-26 09:05:26 +05:30
  • 769c56af6f lora_state_dict sayakpaul 2025-09-26 09:00:42 +05:30
  • 1222b966d7 load_lora_weights() sayakpaul 2025-09-26 08:56:05 +05:30
  • 024932dd19 start unbloating docstrings (save_lora_weights). sayakpaul 2025-09-26 08:52:30 +05:30
  • ec5449f3a1 Support both huggingface_hub v0.x and v1.x (#12389) Lucain 2025-09-25 18:28:54 +02:00
  • 7187bbea46 Test hfh v1.0.0.rc2 ci-test-huggingface-hub-v1.0.0.rc2-release Hugging Face Bot (RC Testing) 2025-09-25 15:03:34 +00:00
  • 40e10daced setup.py ci-test-huggingface-hub-v1.0.0 Wauplin 2025-09-25 13:16:17 +02:00
  • 8851af80a2 this time? Wauplin 2025-09-25 12:00:52 +02:00
  • bbf5890a07 install compel separately Wauplin 2025-09-25 11:50:13 +02:00
  • 4eccafe490 tmp remove transformers from deps Wauplin 2025-09-25 11:27:41 +02:00
  • 2e08fd1bdd install transformers from main Wauplin 2025-09-25 11:19:03 +02:00
  • d252c02d1e support fa (2) through kernels. sayakpaul 2025-09-25 13:24:09 +05:30
  • c386f220ea up sayakpaul 2025-09-25 13:05:39 +05:30
  • fc87f40e7a Merge branch 'main' into qwen-pipeline-mixin Sayak Paul 2025-09-25 08:09:10 +05:30
  • 310fdaf556 Introduce cache-dit to community optimization (#12366) DefTruth 2025-09-25 01:50:57 +08:00
  • 80b9ad6c2d allow prerelease Wauplin 2025-09-24 16:31:34 +02:00
  • 09a251fef2 run on python 3.9 Wauplin 2025-09-24 16:17:02 +02:00