青龍聖者@bdsqlsz
d131bde183
Support for bitsandbytes 0.39.1 with Paged Optimizers (AdamW8bit and Lion8bit) (#631)
...
* ADD libbitsandbytes.dll for 0.38.1
* Delete libbitsandbytes_cuda116.dll
* Delete cextension.py
* add main.py
* Update requirements.txt for bitsandbytes 0.38.1
* Update README.md for bitsandbytes-windows
* Update README-ja.md for bitsandbytes 0.38.1
* Update main.py for return cuda118
* Update train_util.py for lion8bit
* Update train_README-ja.md for lion8bit
* Update train_util.py for add DAdaptAdan and DAdaptSGD
* Update train_util.py for DAdaptadam
* Update train_network.py for dadapt
* Update train_README-ja.md for DAdapt
* Update train_util.py for DAdapt
* Update train_network.py for DAdaptAdaGrad
* Update train_db.py for DAdapt
* Update fine_tune.py for DAdapt
* Update train_textual_inversion.py for DAdapt
* Update train_textual_inversion_XTI.py for DAdapt
* Revert "Merge branch 'qinglong' into main"
This reverts commit b65c023083 , reversing
changes made to f6fda20caf .
* Revert "Update requirements.txt for bitsandbytes 0.38.1"
This reverts commit 83abc60dfa .
* Revert "Delete cextension.py"
This reverts commit 3ba4dfe046 .
* Revert "Update README.md for bitsandbytes-windows"
This reverts commit 4642c52086 .
* Revert "Update README-ja.md for bitsandbytes 0.38.1"
This reverts commit fa6d7485ac .
* Update train_util.py
* Update requirements.txt
* support PagedAdamW8bit/PagedLion8bit
* Update requirements.txt
* update for PagedAdamW8bit and PagedLion8bit
* Revert
* revert main
2023-07-22 19:45:32 +09:00
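The paged 8-bit optimizers added here are real bitsandbytes 0.39.x class names. As a hedged sketch (not the actual `train_util.py` code), an optimizer dispatcher for these names might look like the following; the dictionary values are genuine `bitsandbytes.optim` import paths, while the function itself is illustrative:

```python
# Illustrative sketch: map optimizer_type strings to bitsandbytes 0.39.1
# class paths. The class names are real; the dispatch function is assumed.
PAGED_OPTIMIZERS = {
    "adamw8bit": "bitsandbytes.optim.AdamW8bit",
    "lion8bit": "bitsandbytes.optim.Lion8bit",
    "pagedadamw8bit": "bitsandbytes.optim.PagedAdamW8bit",
    "pagedlion8bit": "bitsandbytes.optim.PagedLion8bit",
}

def resolve_optimizer(optimizer_type: str) -> str:
    """Return the import path for a supported 8-bit optimizer name."""
    key = optimizer_type.strip().lower()
    if key not in PAGED_OPTIMIZERS:
        raise ValueError(f"unsupported optimizer_type: {optimizer_type}")
    return PAGED_OPTIMIZERS[key]
```

Paged optimizers page their state between GPU and CPU memory, which is why they were worth the bitsandbytes version bump.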
Kohya S
9a67e0df39
Merge pull request #610 from lubobill1990/patch-1
...
Update huggingface hub to resolve error on Windows
v0.6.5
2023-07-20 21:45:38 +09:00
Bo Lu
7981ee186f
Update huggingface hub to resolve error on Windows
...
https://github.com/huggingface/huggingface_hub/issues/1423
2023-06-26 01:53:23 +08:00
Kohya S
0cfcb5a49c
fix lr/d*lr is not logged with prodigy in finetune
2023-06-24 08:36:09 +09:00
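Prodigy (like D-Adaptation) maintains its own distance estimate `d`, so the effective step size is `d * lr` rather than `lr` alone; that is the `lr/d*lr` value this fix logs. A minimal sketch of reading both from a param group, using a plain dict in place of a real torch optimizer (Prodigy does store `d` in each param group):

```python
def prodigy_logged_lrs(param_group: dict) -> dict:
    """Return the learning-rate values worth logging for Prodigy/D-Adaptation.

    The optimizer stores its adapted distance estimate under the 'd' key
    of each param group; the effective step size is d * lr.
    """
    lr = param_group["lr"]
    d = param_group.get("d", 1.0)
    return {"lr": lr, "lr/d*lr": d * lr}
```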
Kohya S
c7fd336c5d
Merge pull request #594 from kohya-ss/dev
...
fix same random seed is used in multiple generation
2023-06-16 12:14:20 +09:00
Kohya S
ed30af8343
Merge branch 'main' into dev
2023-06-16 12:10:59 +09:00
Kohya S
1e0b059982
fix same seed is used for multiple generation
2023-06-16 12:10:18 +09:00
Kohya S
038c09f552
Merge pull request #590 from kohya-ss/dev
...
prodigyopt, arbitrary dataset etc.
2023-06-15 22:30:10 +09:00
Kohya S
5d1b54de45
update readme
2023-06-15 22:27:47 +09:00
Kohya S
18156bf2a1
fix same replacement multiple times in dyn prompt
2023-06-15 22:22:12 +09:00
Kohya S
5845de7d7c
common lr checking for dadaptation and prodigy
2023-06-15 21:47:37 +09:00
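D-Adaptation and Prodigy estimate the step size themselves, so they expect `lr` to be set to 1.0. A sketch of the kind of shared sanity check this commit describes; the exact warning text and return shape are assumptions, not the repository's code:

```python
import logging

logger = logging.getLogger(__name__)

def check_adaptive_lr(optimizer_name: str, lrs: list) -> list:
    """Warn when learning rates look wrong for a self-adapting optimizer."""
    warnings = []
    for lr in lrs:
        if lr != 1.0:
            msg = f"{optimizer_name} expects learning rate 1.0, got {lr}"
            warnings.append(msg)
            logger.warning(msg)
    return warnings
```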
青龍聖者@bdsqlsz
e97d67a681
Support for Prodigy (a DAdapt variant) for DyLoRA (#585)
...
* Update train_util.py for DAdaptLion
* Update train_README-zh.md for dadaptlion
* Update train_README-ja.md for DAdaptLion
* add DAdapt V3
* Alignment
* Update train_util.py for experimental
* Update train_util.py V3
* Update train_README-zh.md
* Update train_README-ja.md
* Update train_util.py fix
* Update train_util.py
* support Prodigy
* add lower
2023-06-15 21:12:53 +09:00
Kohya S
f0bb3ae825
add an option to disable controlnet in 2nd stage
2023-06-15 20:56:12 +09:00
Kohya S
9806b00f74
add arbitrary dataset feature to each script
2023-06-15 20:39:39 +09:00
Kohya S
f2989b36c2
fix typos, add comment
2023-06-15 20:37:01 +09:00
Kohya S
624fbadea2
fix dynamic prompt with from_file
2023-06-15 19:19:16 +09:00
Kohya S
d4ba37f543
support dynamic prompt variants
2023-06-15 13:22:06 +09:00
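Dynamic prompt variants use the `{a|b|c}` syntax, with each occurrence resolved independently (the later "fix same replacement multiple times in dyn prompt" commit addresses exactly that independence). A self-contained sketch, not the generation script's actual implementation:

```python
import random
import re

VARIANT_RE = re.compile(r"\{([^{}]+)\}")

def expand_variants(prompt: str, rng: random.Random) -> str:
    """Replace each {a|b|c} group with one randomly chosen option.

    Each occurrence is resolved on its own, so two identical groups in
    one prompt may expand to different options.
    """
    while True:
        m = VARIANT_RE.search(prompt)
        if m is None:
            return prompt
        choice = rng.choice(m.group(1).split("|"))
        prompt = prompt[: m.start()] + choice + prompt[m.end():]
```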
Kohya S
1da6d43109
Merge branch 'main' into dev
2023-06-14 12:49:37 +09:00
Kohya S
9aee793078
support arbitrary dataset for train_network.py
2023-06-14 12:49:12 +09:00
Kohya S
89c3033401
Merge pull request #581 from mio2333/patch-1
...
Update make_captions.py
2023-06-12 22:15:30 +09:00
mio
334d07bf96
Update make_captions.py
...
Append sys.path in make_captions.py so it can load the blip module from the same folder, fixing the error when the script is not run from that folder
2023-06-08 23:39:06 +08:00
Kohya S
6417f5d7c1
Merge pull request #580 from kohya-ss/dev
...
fix clip skip not working in weighted caption training and sample gen
2023-06-08 22:10:30 +09:00
Kohya S
8088c04a71
update readme
2023-06-08 22:06:34 +09:00
Kohya S
f7b1911f1b
Merge branch 'main' into dev
2023-06-08 22:03:06 +09:00
Kohya S
045cd38b6e
fix clip_skip not working in weighted caption training and sample gen
2023-06-08 22:02:46 +09:00
Kohya S
363f1dfab9
Merge pull request #569 from kohya-ss/dev
...
older lycoris support, BREAK support
2023-06-06 22:07:21 +09:00
Kohya S
4e24733f1c
update readme
2023-06-06 22:03:21 +09:00
Kohya S
bb91a10b5f
fix to work with LyCORIS<0.1.6
2023-06-06 21:59:57 +09:00
Kohya S
98635ebde2
Merge branch 'main' into dev
2023-06-06 21:54:29 +09:00
Kohya S
24823b061d
support BREAK in generation script
2023-06-06 21:53:58 +09:00
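BREAK (an A1111 convention) splits a prompt into separately encoded chunks, each getting its own 75-token context window. A minimal sketch of the splitting step only; tokenization and padding are omitted, and the helper name is an assumption:

```python
def split_prompt_on_break(prompt: str) -> list:
    """Split a prompt into chunks at the BREAK keyword.

    Each chunk is later encoded in its own context window; this sketch
    performs only the textual split.
    """
    return [chunk.strip() for chunk in prompt.split("BREAK") if chunk.strip()]
```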
Kohya S
0fe1afd4ef
Merge pull request #562 from u-haru/hotfix/max_mean_logs_with_loss
...
Add loss display
2023-06-05 21:42:25 +09:00
u-haru
5907bbd9de
Add loss display
2023-06-03 21:20:26 +09:00
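PR #562 adds loss statistics (max and mean alongside the running value) to the training logs. A hedged sketch of such a tracker; the class and key names are assumptions, not the script's actual variables:

```python
class LossTracker:
    """Accumulate per-step losses and expose the stats worth logging."""

    def __init__(self) -> None:
        self.losses = []

    def add(self, loss: float) -> None:
        self.losses.append(loss)

    def stats(self) -> dict:
        return {
            "loss/current": self.losses[-1],
            "loss/average": sum(self.losses) / len(self.losses),
            "loss/max": max(self.losses),
        }
```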
Kohya S
7c38c33ed6
Merge pull request #560 from kohya-ss/dev
...
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:46:02 +09:00
Kohya S
5bec05e045
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:42:32 +09:00
Kohya S
6084611508
Merge pull request #559 from kohya-ss/dev
...
max norm, dropout, scale v-pred loss
2023-06-03 11:40:56 +09:00
Kohya S
71a7a27319
update readme
2023-06-03 11:33:18 +09:00
Kohya S
ec2efe52e4
scale v-pred loss like noise pred
2023-06-03 10:52:22 +09:00
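One common way to scale a v-prediction loss to the magnitude of a noise-prediction loss is an SNR-based factor of snr/(snr+1), with the SNR clamped for very low-noise timesteps; whether this matches the commit's exact formula is an assumption. A stdlib sketch on scalar values:

```python
def scale_v_pred_loss_like_noise_pred(loss: float, snr: float,
                                      max_snr: float = 1000.0) -> float:
    """Scale a v-prediction loss by snr/(snr+1), clamping very large SNRs.

    The factor approaches 1 at high SNR (early timesteps) and shrinks the
    loss at low SNR, mimicking the magnitude of a noise-prediction loss.
    """
    snr = min(snr, max_snr)
    return loss * snr / (snr + 1.0)
```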
Kohya S
0f0158ddaa
scale in rank dropout, check training in dropout
2023-06-02 07:29:59 +09:00
Kohya S
dde7807b00
add rank dropout/module dropout
2023-06-01 22:21:36 +09:00
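Rank dropout zeroes random ranks of the LoRA update and rescales the survivors (the follow-up "scale in rank dropout" commit adds the compensation), while module dropout skips an entire LoRA module for a step. A stdlib-only sketch, assuming 1/(1-p) rescaling:

```python
import random

def rank_dropout_mask(rank: int, p: float, rng: random.Random) -> list:
    """Build a per-rank keep mask, rescaled by 1/(1-p) to preserve magnitude."""
    scale = 1.0 / (1.0 - p) if p < 1.0 else 0.0
    return [scale if rng.random() >= p else 0.0 for _ in range(rank)]

def module_dropped(p: float, rng: random.Random) -> bool:
    """Module dropout: with probability p, skip the LoRA module entirely."""
    return rng.random() < p
```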
Kohya S
f8e8df5a04
fix crash in gen script, change to network_dropout
2023-06-01 20:07:04 +09:00
Kohya S
f4c9276336
add scaling to max norm
2023-06-01 19:46:17 +09:00
Kohya S
a5c38e5d5b
fix crashing when max_norm is disabled
2023-06-01 19:32:22 +09:00
AI-Casanova
9c7237157d
Dropout and Max Norm Regularization for LoRA training (#545)
...
* Instantiate max_norm
* minor
* Move to end of step
* argparse
* metadata
* phrasing
* Sqrt ratio and logging
* fix logging
* Dropout test
* Dropout Args
* Dropout changed to affect LoRA only
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-06-01 14:58:38 +09:00
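Max norm regularization (PR #545) caps the norm of each LoRA weight's combined update, scaling it down when it exceeds the limit. A sketch on a plain vector, assuming a simple L2 cap; the real code operates on the merged down/up tensors:

```python
import math

def apply_max_norm(weights: list, max_norm: float):
    """Scale weights down so their L2 norm does not exceed max_norm.

    Returns the (possibly rescaled) weights and the scale factor applied;
    a scale of 1.0 means the cap was not hit.
    """
    norm = math.sqrt(sum(w * w for w in weights))
    if norm <= max_norm or norm == 0.0:
        return weights, 1.0
    scale = max_norm / norm
    return [w * scale for w in weights], scale
```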
TingTingin
5931948adb
Adjusted English grammar in logs to be clearer (#554)
...
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
2023-06-01 12:31:33 +09:00
Kohya S
8a5e3904a0
Merge pull request #553 from kohya-ss/dev
...
no caption warning, network merging before training
2023-05-31 21:04:50 +09:00
Kohya S
d679dc4de1
Merge branch 'main' into dev
2023-05-31 20:58:32 +09:00
Kohya S
a002d10a4d
update readme
2023-05-31 20:57:01 +09:00
Kohya S
3a06968332
warn and continue if huggingface uploading failed
2023-05-31 20:48:33 +09:00
Kohya S
6fbd526931
show multiplier for base weights to console
2023-05-31 20:23:19 +09:00
Kohya S
c437dce056
change option name for merging network weights
2023-05-30 23:19:29 +09:00