Kohya S
18156bf2a1
fix same replacement being applied multiple times in dynamic prompts
2023-06-15 22:22:12 +09:00
Kohya S
5845de7d7c
common lr checking for dadaptation and prodigy
2023-06-15 21:47:37 +09:00
青龍聖者@bdsqlsz
e97d67a681
Support for Prodigy (DAdapt variant for DyLoRA) (#585)
...
* Update train_util.py for DAdaptLion
* Update train_README-zh.md for dadaptlion
* Update train_README-ja.md for DAdaptLion
* add DAdapt V3
* Alignment
* Update train_util.py for experimental
* Update train_util.py V3
* Update train_README-zh.md
* Update train_README-ja.md
* Update train_util.py fix
* Update train_util.py
* support Prodigy
* add lower
2023-06-15 21:12:53 +09:00
Kohya S
f0bb3ae825
add an option to disable controlnet in 2nd stage
2023-06-15 20:56:12 +09:00
Kohya S
9806b00f74
add arbitrary dataset feature to each script
2023-06-15 20:39:39 +09:00
Kohya S
f2989b36c2
fix typos, add comment
2023-06-15 20:37:01 +09:00
Kohya S
624fbadea2
fix dynamic prompt with from_file
2023-06-15 19:19:16 +09:00
Kohya S
d4ba37f543
support dynamic prompt variants
2023-06-15 13:22:06 +09:00
Kohya S
1da6d43109
Merge branch 'main' into dev
2023-06-14 12:49:37 +09:00
Kohya S
9aee793078
support arbitrary dataset for train_network.py
2023-06-14 12:49:12 +09:00
Kohya S
89c3033401
Merge pull request #581 from mio2333/patch-1
...
Update make_captions.py
2023-06-12 22:15:30 +09:00
mio
334d07bf96
Update make_captions.py
...
Append sys.path for make_captions.py so it can load the BLIP module from the same folder, fixing the error that occurs when the script is not run from that folder
2023-06-08 23:39:06 +08:00
Kohya S
6417f5d7c1
Merge pull request #580 from kohya-ss/dev
...
fix clip skip not working in weighted caption training and sample gen
2023-06-08 22:10:30 +09:00
Kohya S
8088c04a71
update readme
2023-06-08 22:06:34 +09:00
Kohya S
f7b1911f1b
Merge branch 'main' into dev
2023-06-08 22:03:06 +09:00
Kohya S
045cd38b6e
fix clip_skip not working in weighted captions and sample generation
2023-06-08 22:02:46 +09:00
Kohya S
363f1dfab9
Merge pull request #569 from kohya-ss/dev
...
older lycoris support, BREAK support
2023-06-06 22:07:21 +09:00
Kohya S
4e24733f1c
update readme
2023-06-06 22:03:21 +09:00
Kohya S
bb91a10b5f
fix to work with LyCORIS<0.1.6
2023-06-06 21:59:57 +09:00
Kohya S
98635ebde2
Merge branch 'main' into dev
2023-06-06 21:54:29 +09:00
Kohya S
24823b061d
support BREAK in generation script
2023-06-06 21:53:58 +09:00
Kohya S
0fe1afd4ef
Merge pull request #562 from u-haru/hotfix/max_mean_logs_with_loss
...
Add loss display
2023-06-05 21:42:25 +09:00
u-haru
5907bbd9de
Add loss display
2023-06-03 21:20:26 +09:00
Kohya S
7c38c33ed6
Merge pull request #560 from kohya-ss/dev
...
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:46:02 +09:00
Kohya S
5bec05e045
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:42:32 +09:00
Kohya S
6084611508
Merge pull request #559 from kohya-ss/dev
...
max norm, dropout, scale v-pred loss
2023-06-03 11:40:56 +09:00
Kohya S
71a7a27319
update readme
2023-06-03 11:33:18 +09:00
Kohya S
ec2efe52e4
scale v-pred loss like noise pred
2023-06-03 10:52:22 +09:00
Kohya S
0f0158ddaa
scale in rank dropout, check training in dropout
2023-06-02 07:29:59 +09:00
Kohya S
dde7807b00
add rank dropout/module dropout
2023-06-01 22:21:36 +09:00
Kohya S
f8e8df5a04
fix crash in generation script, change to network_dropout
2023-06-01 20:07:04 +09:00
Kohya S
f4c9276336
add scaling to max norm
2023-06-01 19:46:17 +09:00
Kohya S
a5c38e5d5b
fix crashing when max_norm is disabled
2023-06-01 19:32:22 +09:00
AI-Casanova
9c7237157d
Dropout and Max Norm Regularization for LoRA training (#545)
...
* Instantiate max_norm
* minor
* Move to end of step
* argparse
* metadata
* phrasing
* Sqrt ratio and logging
* fix logging
* Dropout test
* Dropout Args
* Dropout changed to affect LoRA only
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-06-01 14:58:38 +09:00
TingTingin
5931948adb
Adjusted English grammar in logs to be clearer (#554)
...
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
2023-06-01 12:31:33 +09:00
Kohya S
8a5e3904a0
Merge pull request #553 from kohya-ss/dev
...
no caption warning, network merging before training
2023-05-31 21:04:50 +09:00
Kohya S
d679dc4de1
Merge branch 'main' into dev
2023-05-31 20:58:32 +09:00
Kohya S
a002d10a4d
update readme
2023-05-31 20:57:01 +09:00
Kohya S
3a06968332
warn and continue if huggingface uploading failed
2023-05-31 20:48:33 +09:00
Kohya S
6fbd526931
show multiplier for base weights to console
2023-05-31 20:23:19 +09:00
Kohya S
c437dce056
change option name for merging network weights
2023-05-30 23:19:29 +09:00
Kohya S
fc00691898
enable multiple module weights
2023-05-30 23:10:41 +09:00
Kohya S
990ceddd14
show warning if no caption and no class token
2023-05-30 22:53:50 +09:00
Kohya S
226db64736
Merge pull request #542 from u-haru/feature/differential_learning
...
Add differential learning feature
2023-05-29 08:38:46 +09:00
Kohya S
2429ac73b2
Merge pull request #533 from TingTingin/main
...
Added warning on training without captions
2023-05-29 08:37:33 +09:00
u-haru
dd8e17cb37
Add differential learning feature
2023-05-27 05:15:02 +09:00
TingTingin
db756e9a34
Update train_util.py
...
Removed the sleep since it triggers per subset; with many subsets it would fire multiple times
2023-05-26 08:08:34 -04:00
Kohya S
16e5981d31
Merge pull request #538 from kohya-ss/dev
...
update train_network doc. add warning to merge_lora.py
2023-05-25 22:24:16 +09:00
Kohya S
575c51fd3b
Merge branch 'main' into dev
2023-05-25 22:14:40 +09:00
Kohya S
5b2447f71d
add warning to merge_lora.py
2023-05-25 22:14:21 +09:00