Kohya S | c142dadb46 | support SAI model spec | 2023-08-06 21:50:05 +09:00
Kohya S | e5f9772a35 | fix text encoder training in SDXL not working | 2023-08-05 21:22:50 +09:00
Kohya S | 9d855091bf | make bitsandbytes optional | 2023-08-04 22:29:14 +09:00
Kohya S | f3be995c28 | remove debug print | 2023-08-04 08:44:17 +09:00
Kohya S | 9d7619d1eb | remove debug print | 2023-08-04 08:42:54 +09:00
Kohya S | c6d52fdea4 | add workaround for CLIP's pooled-output bug | 2023-08-04 08:38:27 +09:00
Kohya S | 0636399c8c | add v-pred-like loss for noise prediction | 2023-07-31 08:23:28 +09:00
Kohya S | f61996b425 | remove dependency on albumentations | 2023-07-30 16:29:53 +09:00
Kohya S | 496c3f2732 | support arbitrary args for Diffusers lr scheduler | 2023-07-30 14:36:03 +09:00
Kohya S | a296654c1b | refactor optimizer selection for bitsandbytes | 2023-07-30 13:43:29 +09:00
Kohya S | b62185b821 | change method name, add comments | 2023-07-30 13:34:07 +09:00
Kohya S | e6034b7eb6 | move cache release outside of the loop | 2023-07-30 13:30:42 +09:00
青龍聖者@bdsqlsz | 9ec70252d0 | Add Paged/adam8bit/lion8bit for SDXL, bitsandbytes 0.39.1 cuda118 on Windows (#623) | 2023-07-30 13:15:13 +09:00
    * ADD libbitsandbytes.dll for 0.38.1
    * Delete libbitsandbytes_cuda116.dll
    * Delete cextension.py
    * add main.py
    * Update requirements.txt for bitsandbytes 0.38.1
    * Update README.md for bitsandbytes-windows
    * Update README-ja.md for bitsandbytes 0.38.1
    * Update main.py for return cuda118
    * Update train_util.py for lion8bit
    * Update train_README-ja.md for lion8bit
    * Update train_util.py for add DAdaptAdan and DAdaptSGD
    * Update train_util.py for DAdaptadam
    * Update train_network.py for dadapt
    * Update train_README-ja.md for DAdapt
    * Update train_util.py for DAdapt
    * Update train_network.py for DAdaptAdaGrad
    * Update train_db.py for DAdapt
    * Update fine_tune.py for DAdapt
    * Update train_textual_inversion.py for DAdapt
    * Update train_textual_inversion_XTI.py for DAdapt
    * Revert "Merge branch 'qinglong' into main"
      This reverts commit b65c023083, reversing changes made to f6fda20caf.
    * Revert "Update requirements.txt for bitsandbytes 0.38.1"
      This reverts commit 83abc60dfa.
    * Revert "Delete cextension.py"
      This reverts commit 3ba4dfe046.
    * Revert "Update README.md for bitsandbytes-windows"
      This reverts commit 4642c52086.
    * Revert "Update README-ja.md for bitsandbytes 0.38.1"
      This reverts commit fa6d7485ac.
    * Update train_util.py for DAdaptLion
    * Update train_README-zh.md for dadaptlion
    * Update train_README-ja.md for DAdaptLion
    * add DAdapt V3
    * Alignment
    * Update train_util.py for experimental
    * Update train_util.py V3
    * Update train_util.py
    * Update requirements.txt
    * Update train_README-zh.md
    * Update train_README-ja.md
    * Update train_util.py fix
    * Update train_util.py
    * support Prodigy
    * add lower
    * Update main.py
    * support PagedAdamW8bit/PagedLion8bit
    * Update requirements.txt
    * update for PagedAdamW8bit and PagedLion8bit
    * Revert
    * revert main
    * Update train_util.py
    * update for bitsandbytes 0.39.1
    * Update requirements.txt
    * vram leak fix
    Co-authored-by: Pam <pamhome21@gmail.com>
Kohya S | e20b6acfe9 | Merge pull request #676 from Isotr0py/sdxl (fix RAM leak when loading SDXL model on low-RAM devices) | 2023-07-30 12:46:23 +09:00
Isotr0py | d9180c03f6 | fix typos for _load_state_dict | 2023-07-29 22:25:00 +08:00
Kohya S | 4072f723c1 | Merge branch 'main' into sdxl | 2023-07-29 14:55:03 +09:00
Kohya S | 1e4512b2c8 | support ckpt without position id in SD v1 (#687) | 2023-07-29 14:19:25 +09:00
Isotr0py | 1199eacb72 | fix typo | 2023-07-28 13:49:37 +08:00
Isotr0py | fdb58b0b62 | fix mismatched dtype | 2023-07-28 13:47:54 +08:00
Isotr0py | 315fbc11e5 | refactor model loading to catch errors | 2023-07-28 13:10:38 +08:00
Isotr0py | 272dd993e6 | Merge branch 'sdxl' into sdxl | 2023-07-28 10:19:37 +08:00
Isotr0py | 96a52d9810 | add dtype to U-Net loading | 2023-07-27 23:58:25 +08:00
Isotr0py | 50544b7805 | fix pipeline dtype | 2023-07-27 23:16:58 +08:00
Kohya S | b78c0e2a69 | remove unused function | 2023-07-25 19:07:26 +09:00
Isotr0py | eec6aaddda | fix safetensors "device invalid" error | 2023-07-23 13:29:29 +08:00
Isotr0py | bb167f94ca | init U-Net with empty weights | 2023-07-23 13:17:11 +08:00
Kohya S | 50b53e183e | re-organize imports | 2023-07-23 13:33:02 +09:00
青龍聖者@bdsqlsz | d131bde183 | Support for bitsandbytes 0.39.1 with Paged Optimizer (AdamW8bit and Lion8bit) (#631) | 2023-07-22 19:45:32 +09:00
    * ADD libbitsandbytes.dll for 0.38.1
    * Delete libbitsandbytes_cuda116.dll
    * Delete cextension.py
    * add main.py
    * Update requirements.txt for bitsandbytes 0.38.1
    * Update README.md for bitsandbytes-windows
    * Update README-ja.md for bitsandbytes 0.38.1
    * Update main.py for return cuda118
    * Update train_util.py for lion8bit
    * Update train_README-ja.md for lion8bit
    * Update train_util.py for add DAdaptAdan and DAdaptSGD
    * Update train_util.py for DAdaptadam
    * Update train_network.py for dadapt
    * Update train_README-ja.md for DAdapt
    * Update train_util.py for DAdapt
    * Update train_network.py for DAdaptAdaGrad
    * Update train_db.py for DAdapt
    * Update fine_tune.py for DAdapt
    * Update train_textual_inversion.py for DAdapt
    * Update train_textual_inversion_XTI.py for DAdapt
    * Revert "Merge branch 'qinglong' into main"
      This reverts commit b65c023083, reversing changes made to f6fda20caf.
    * Revert "Update requirements.txt for bitsandbytes 0.38.1"
      This reverts commit 83abc60dfa.
    * Revert "Delete cextension.py"
      This reverts commit 3ba4dfe046.
    * Revert "Update README.md for bitsandbytes-windows"
      This reverts commit 4642c52086.
    * Revert "Update README-ja.md for bitsandbytes 0.38.1"
      This reverts commit fa6d7485ac.
    * Update train_util.py
    * Update requirements.txt
    * support PagedAdamW8bit/PagedLion8bit
    * Update requirements.txt
    * update for PagedAdamW8bit and PagedLion8bit
    * Revert
    * revert main
Kohya S | 73a08c0be0 | Merge pull request #630 from ddPn08/sdxl (make tracker init_kwargs configurable) | 2023-07-20 22:05:55 +09:00
Kohya S | acf16c063a | make it work with PyTorch 1.12 | 2023-07-20 21:41:16 +09:00
Kohya S | fc276a51fb | fix invalid args checking in SDXL TI training | 2023-07-20 14:50:57 +09:00
Kohya S | 225e871819 | enable full bf16 training in train_network | 2023-07-19 08:41:42 +09:00
Kohya S | 6d2d8dfd2f | add zero_terminal_snr option | 2023-07-18 23:17:23 +09:00
Kohya S | 0ec7166098 | make crop top/left the same as Stability AI's preprocessing | 2023-07-18 21:39:36 +09:00
Kohya S | 41d195715d | fix scheduler steps with gradient accumulation | 2023-07-16 15:56:29 +09:00
Kohya S | 516f64f4d9 | add caching to disk for text encoder outputs | 2023-07-16 14:53:47 +09:00
Kohya S | 94c151aea3 | refactor caching latents (flip in same npz, etc.) | 2023-07-15 18:28:33 +09:00
Kohya S | 81fa54837f | fix sampling in multi-GPU training | 2023-07-15 11:21:14 +09:00
Kohya S | 9de357e373 | fix tokenizer 2 not matching the open clip tokenizer | 2023-07-14 12:27:19 +09:00
Kohya S | b4a3824ce4 | change tokenizer from open clip to transformers | 2023-07-13 20:49:26 +09:00
Kohya S | 3bb80ebf20 | fix sample generation failing in LoRA training | 2023-07-13 19:02:34 +09:00
Kohya S | 8fa5fb2816 | support Diffusers format for SDXL | 2023-07-12 21:57:14 +09:00
Kohya S | 8df948565a | remove unnecessary code | 2023-07-12 21:53:02 +09:00
Kohya S | 814996b14f | fix NaN in sampled images | 2023-07-11 23:18:35 +09:00
ddPn08 | b841dd78fe | make tracker init_kwargs configurable | 2023-07-11 10:21:45 +09:00
Kohya S | f54b784d88 | support textual inversion training | 2023-07-10 22:04:02 +09:00
Kohya S | b6e328ea8f | don't hold latents in memory for fine-tuning dataset | 2023-07-10 08:46:15 +09:00
Kohya S | c2ceb6de5f | fix uncond/cond order | 2023-07-09 21:14:12 +09:00
Kohya S | 77ec70d145 | fix conditioning | 2023-07-09 19:00:38 +09:00
Kohya S | a380502c01 | fix pad token not being handled | 2023-07-09 18:13:49 +09:00