Kohya S
c0a7df9ee1
fix eps value, enable xformers, etc.
2023-06-03 21:29:27 +09:00
u-haru
5907bbd9de
Add loss display
2023-06-03 21:20:26 +09:00
Kohya S
5db792b10b
initial commit for original U-Net
2023-06-03 19:24:47 +09:00
Kohya S
7c38c33ed6
Merge pull request #560 from kohya-ss/dev
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:46:02 +09:00
Kohya S
5bec05e045
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:42:32 +09:00
Kohya S
6084611508
Merge pull request #559 from kohya-ss/dev
max norm, dropout, scale v-pred loss
2023-06-03 11:40:56 +09:00
Kohya S
71a7a27319
update readme
2023-06-03 11:33:18 +09:00
Kohya S
ec2efe52e4
scale v-pred loss like noise pred
2023-06-03 10:52:22 +09:00
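The "scale v-pred loss like noise pred" commit above refers to rescaling the v-prediction MSE so it matches the magnitude of the usual noise-prediction MSE. A minimal sketch, assuming the per-timestep scaling factor is 1/(SNR_t + 1) (the function name and signature here are illustrative, not the repository's actual API):

```python
import torch

def scale_v_pred_loss_like_noise_pred(loss, timesteps, alphas_cumprod):
    """Sketch (assumption): for v-prediction, the per-timestep MSE is
    (SNR_t + 1) times the equivalent noise-prediction MSE, so dividing
    by (SNR_t + 1) puts both parameterizations on the same scale.

    loss: per-sample loss, shape (batch,)
    timesteps: sampled timestep indices, shape (batch,)
    alphas_cumprod: scheduler's cumulative alpha products, shape (T,)
    """
    alpha_bar = alphas_cumprod[timesteps]   # (batch,)
    snr = alpha_bar / (1.0 - alpha_bar)     # signal-to-noise ratio per sample
    return loss / (snr + 1.0)
```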
Kohya S
0f0158ddaa
scale in rank dropout, check training in dropout
2023-06-02 07:29:59 +09:00
Kohya S
dde7807b00
add rank dropout/module dropout
2023-06-01 22:21:36 +09:00
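The "add rank dropout/module dropout" commit above introduces two dropout variants for LoRA. A minimal sketch of the idea, with hypothetical names and shapes (not the repository's actual implementation): module dropout skips the whole LoRA branch for a step, while rank dropout zeros individual rank channels and rescales the survivors.

```python
import torch

def lora_forward(x, lora_down, lora_up, scale=1.0,
                 rank_dropout=0.0, module_dropout=0.0, training=True):
    """Illustrative LoRA branch with rank dropout and module dropout.

    lora_down: (rank, in_features), lora_up: (out_features, rank).
    - module dropout: with probability `module_dropout`, drop the whole
      LoRA branch for this step (return zeros).
    - rank dropout: zero individual rank channels, rescaling survivors by
      1/(1 - p) so the expected magnitude is unchanged.
    """
    out_features = lora_up.shape[0]
    if training and module_dropout > 0.0 and torch.rand(()).item() < module_dropout:
        return x.new_zeros(*x.shape[:-1], out_features)
    h = x @ lora_down.T                       # (..., rank)
    if training and rank_dropout > 0.0:
        mask = (torch.rand(h.shape[-1]) > rank_dropout).to(h.dtype)
        h = h * mask / (1.0 - rank_dropout)   # keep expectation constant
    return (h @ lora_up.T) * scale
```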
ddPn08
1e3daa247b
fix bucketing
2023-06-01 21:58:45 +09:00
ddPn08
3bd00b88c2
support for controlnet in sample output
2023-06-01 20:48:30 +09:00
ddPn08
62d00b4520
add controlnet training
2023-06-01 20:48:25 +09:00
ddPn08
4f8ce00477
update diffusers to 1.16 | finetune
2023-06-01 20:47:54 +09:00
ddPn08
1214f35985
update diffusers to 1.16 | train_db
2023-06-01 20:39:31 +09:00
ddPn08
e743ee5d5c
update diffusers to 1.16 | dylora
2023-06-01 20:39:30 +09:00
ddPn08
23c4e5cb01
update diffusers to 1.16 | train_textual_inversion
2023-06-01 20:39:29 +09:00
ddPn08
1f1cae6c5a
make the device of snr_weight the same as loss
2023-06-01 20:39:28 +09:00
ddPn08
c8d209d36c
update diffusers to 1.16 | train_network
2023-06-01 20:39:26 +09:00
Kohya S
f8e8df5a04
fix crash in gen script, change to network_dropout
2023-06-01 20:07:04 +09:00
Kohya S
f4c9276336
add scaling to max norm
2023-06-01 19:46:17 +09:00
Kohya S
a5c38e5d5b
fix crashing when max_norm is disabled
2023-06-01 19:32:22 +09:00
AI-Casanova
9c7237157d
Dropout and Max Norm Regularization for LoRA training (#545)
* Instantiate max_norm
* minor
* Move to end of step
* argparse
* metadata
* phrasing
* Sqrt ratio and logging
* fix logging
* Dropout test
* Dropout Args
* Dropout changed to affect LoRA only
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-06-01 14:58:38 +09:00
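PR #545 above adds max-norm regularization: after each optimizer step, the effective LoRA update is clipped so its norm never exceeds a cap, with the square-root ratio applied to both factors (the PR's "Sqrt ratio and logging" step). A minimal sketch under those assumptions; the function name, pair layout, and return values are illustrative:

```python
import torch

def apply_max_norm(lora_pairs, max_norm, scale=1.0):
    """Hypothetical max-norm regularization applied after an optimizer step.

    lora_pairs: list of (lora_down, lora_up) tensors with shapes
    (rank, in_features) and (out_features, rank).
    If the Frobenius norm of the effective update (up @ down * scale)
    exceeds `max_norm`, both factors are shrunk by sqrt(max_norm / norm),
    so their product shrinks by the full ratio.
    Returns (num_clipped, mean_norm) for logging.
    """
    norms = []
    clipped = 0
    with torch.no_grad():
        for down, up in lora_pairs:
            norm = (up @ down).norm() * scale
            norms.append(norm.item())
            if norm > max_norm:
                ratio = (max_norm / norm).sqrt()
                down.mul_(ratio)   # sqrt on each factor -> full ratio on product
                up.mul_(ratio)
                clipped += 1
    return clipped, sum(norms) / len(norms)
```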
TingTingin
5931948adb
Adjusted English grammar in logs to be clearer (#554)
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
2023-06-01 12:31:33 +09:00
Kohya S
8a5e3904a0
Merge pull request #553 from kohya-ss/dev
no caption warning, network merging before training
2023-05-31 21:04:50 +09:00
Kohya S
d679dc4de1
Merge branch 'main' into dev
2023-05-31 20:58:32 +09:00
Kohya S
a002d10a4d
update readme
2023-05-31 20:57:01 +09:00
Kohya S
3a06968332
warn and continue if huggingface upload fails
2023-05-31 20:48:33 +09:00
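The warn-and-continue commit above reflects a simple pattern: a failed Hugging Face upload should not abort a long training run. A minimal sketch of that behavior; `upload_fn` is a placeholder standing in for the real upload call, not the repository's actual function:

```python
import warnings

def upload_with_warning(upload_fn, *args, **kwargs):
    """Sketch: run the (placeholder) upload callable and, on any failure,
    emit a warning and keep going instead of raising. Returns True on
    success, False on failure."""
    try:
        upload_fn(*args, **kwargs)
        return True
    except Exception as exc:  # intentionally broad: keep training alive
        warnings.warn(f"huggingface upload failed, continuing training: {exc}")
        return False
```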
Kohya S
6fbd526931
show multiplier for base weights to console
2023-05-31 20:23:19 +09:00
Kohya S
c437dce056
change option name for merging network weights
2023-05-30 23:19:29 +09:00
Kohya S
fc00691898
enable multiple module weights
2023-05-30 23:10:41 +09:00
Kohya S
990ceddd14
show warning if no caption and no class token
2023-05-30 22:53:50 +09:00
Kohya S
226db64736
Merge pull request #542 from u-haru/feature/differential_learning
Add differential learning feature
2023-05-29 08:38:46 +09:00
Kohya S
2429ac73b2
Merge pull request #533 from TingTingin/main
Added warning on training without captions
2023-05-29 08:37:33 +09:00
u-haru
dd8e17cb37
Add differential learning feature
2023-05-27 05:15:02 +09:00
TingTingin
db756e9a34
Update train_util.py
I removed the sleep since it triggers per subset, and if someone had many subsets it would trigger multiple times
2023-05-26 08:08:34 -04:00
Kohya S
16e5981d31
Merge pull request #538 from kohya-ss/dev
update train_network doc. add warning to merge_lora.py
2023-05-25 22:24:16 +09:00
Kohya S
575c51fd3b
Merge branch 'main' into dev
2023-05-25 22:14:40 +09:00
Kohya S
5b2447f71d
add warning to merge_lora.py
2023-05-25 22:14:21 +09:00
Kohya S
0ccb4d4a3a
Merge pull request #537 from kohya-ss/dev
support D-Adaptation v3.0
2023-05-25 22:05:24 +09:00
Kohya S
b5bb8bec67
update readme
2023-05-25 22:03:04 +09:00
青龍聖者@bdsqlsz
5cdf4e34a1
support for DAdaptation V3 (#530)
* Update train_util.py for DAdaptLion
* Update train_README-zh.md for dadaptlion
* Update train_README-ja.md for DAdaptLion
* add DAdatpt V3
* Alignment
* Update train_util.py for experimental
* Update train_util.py V3
* Update train_README-zh.md
* Update train_README-ja.md
* Update train_util.py fix
* Update train_util.py
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-05-25 21:52:36 +09:00
TingTingin
061e157191
Update train_util.py
2023-05-23 02:02:39 -04:00
TingTingin
d859a3a925
Update train_util.py
fix mistake
2023-05-23 02:00:33 -04:00
TingTingin
5a1a14f9fc
Update train_util.py
Added a feature to add "." to caption_extension if it is missing
Added warning on training without captions
2023-05-23 01:57:35 -04:00
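The caption_extension commit above normalizes the user-supplied extension so that `txt` and `.txt` behave the same. A minimal sketch of that normalization; the function name is illustrative:

```python
def normalize_caption_extension(ext):
    """Sketch: prepend '.' to caption_extension when the user omits it,
    so 'txt' and '.txt' are treated identically. An empty value is
    returned unchanged."""
    if ext and not ext.startswith("."):
        ext = "." + ext
    return ext
```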
Kohya S
b6ba4cac83
Merge pull request #528 from kohya-ss/dev
save_state handling, old LoRA support etc.
2023-05-22 18:51:18 +09:00
Kohya S
99b607c60c
update readme
2023-05-22 18:46:57 +09:00
Kohya S
289298b17d
Merge pull request #527 from Manjiz/main
fix: support old LoRA without alpha raise "TypeError: argument of typ…
2023-05-22 18:36:34 +09:00
琴动我心
f7a1868fc2
fix: support old LoRA without alpha that raises "TypeError: argument of type 'int' is not iterable"
2023-05-22 17:15:51 +08:00
Kohya S
02bb8e0ac3
use xformers in VAE in gen script
2023-05-21 12:59:01 +09:00