Commit Graph

801 Commits

Author SHA1 Message Date
ykume
4b7b3bc04a fix saved SD dict is invalid for VAE 2023-06-11 17:35:00 +09:00
ykume
035dd3a900 fix mem_eff_attn not working 2023-06-11 17:08:21 +09:00
ykume
4e25c8f78e fix to work with Diffusers 0.17.0 2023-06-11 16:57:17 +09:00
ykume
7f6b581ef8 support memory efficient attn (not xformers) 2023-06-11 16:54:41 +09:00
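The memory-efficient attention work above (the mem_eff_attn option) avoids xformers. Below is a minimal sketch of the same idea, assuming PyTorch 2.0+, whose built-in scaled_dot_product_attention can dispatch to a memory-efficient kernel; it is an illustration, not the repository's implementation.

```python
# Minimal sketch (not the repository's mem_eff_attn code): PyTorch 2.0+ ships
# scaled_dot_product_attention, which can use a memory-efficient kernel instead
# of materializing the full attention matrix.
import torch
import torch.nn.functional as F

def memory_efficient_attention(q, k, v):
    # q, k, v: (batch, heads, tokens, head_dim)
    return F.scaled_dot_product_attention(q, k, v)

q = torch.randn(2, 8, 77, 64)
k = torch.randn(2, 8, 77, 64)
v = torch.randn(2, 8, 77, 64)
out = memory_efficient_attention(q, k, v)
print(out.shape)  # torch.Size([2, 8, 77, 64])
```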
ykume
cc274fb7fb update diffusers ver, remove tensorflow 2023-06-11 16:54:10 +09:00
Kohya S
dccdb8771c support sample generation in training 2023-06-07 08:12:52 +09:00
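Sample generation during training is typically hooked into the training loop at a fixed step interval. A rough sketch of that pattern follows; the callables and the interval name are placeholders, not the script's actual options.

```python
# Illustrative only: step_fn, generate_samples, and sample_every_n_steps are
# hypothetical placeholders, not the training script's real API.
def train_loop(dataloader, step_fn, optimizer, generate_samples, sample_every_n_steps=500):
    global_step = 0
    for batch in dataloader:
        loss = step_fn(batch)            # forward pass + loss for one batch
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        global_step += 1
        # periodically render preview images with the current weights
        if sample_every_n_steps > 0 and global_step % sample_every_n_steps == 0:
            generate_samples(step=global_step)
```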
Kohya S
d4b5cab7f7 Merge branch 'main' into original-u-net 2023-06-07 07:42:27 +09:00
Kohya S
363f1dfab9 Merge pull request #569 from kohya-ss/dev
older lycoris support, BREAK support
2023-06-06 22:07:21 +09:00
Kohya S
4e24733f1c update readme 2023-06-06 22:03:21 +09:00
Kohya S
bb91a10b5f fix to work with LyCORIS<0.1.6 2023-06-06 21:59:57 +09:00
Kohya S
98635ebde2 Merge branch 'main' into dev 2023-06-06 21:54:29 +09:00
Kohya S
24823b061d support BREAK in generation script 2023-06-06 21:53:58 +09:00
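In the Automatic1111 web UI, BREAK starts a new prompt chunk that is encoded separately and then concatenated with the others. The sketch below shows that splitting idea with a hypothetical encode_chunk helper; it is not the generation script's exact handling.

```python
# Sketch of the BREAK keyword idea: split the prompt at "BREAK", encode each
# chunk separately, and concatenate the embeddings. encode_chunk is a
# placeholder for a CLIP text-encoder call returning (1, 77, dim).
import torch

def encode_with_break(prompt: str, encode_chunk):
    chunks = [c.strip() for c in prompt.split("BREAK") if c.strip()]
    embeddings = [encode_chunk(c) for c in chunks]
    return torch.cat(embeddings, dim=1)  # (1, 77 * n_chunks, dim)

prompt = "a castle on a hill BREAK dramatic sunset lighting, volumetric fog"
# encode_with_break(prompt, encode_chunk) would yield two concatenated chunks here.
```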
Kohya S
0fe1afd4ef Merge pull request #562 from u-haru/hotfix/max_mean_logs_with_loss
add loss display
2023-06-05 21:42:25 +09:00
Kohya S
c0a7df9ee1 fix eps value, enable xformers, etc. 2023-06-03 21:29:27 +09:00
u-haru
5907bbd9de add loss display 2023-06-03 21:20:26 +09:00
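The loss-display change (branch hotfix/max_mean_logs_with_loss) reports mean/max statistics alongside the per-step loss. A small illustration of that kind of bookkeeping; the names actually written to the logs may differ.

```python
# Illustration of tracking mean/max loss next to the per-step value.
class LossRecorder:
    def __init__(self):
        self.losses = []

    def add(self, loss: float):
        self.losses.append(loss)

    @property
    def mean(self) -> float:
        return sum(self.losses) / len(self.losses)

    @property
    def max(self) -> float:
        return max(self.losses)

recorder = LossRecorder()
recorder.add(0.12)
recorder.add(0.08)
print(f"loss current: 0.0800, mean: {recorder.mean:.4f}, max: {recorder.max:.4f}")
```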
Kohya S
5db792b10b initial commit for original U-Net 2023-06-03 19:24:47 +09:00
Kohya S
7c38c33ed6 Merge pull request #560 from kohya-ss/dev
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:46:02 +09:00
Kohya S
5bec05e045 move max_norm to lora to avoid crashing in lycoris 2023-06-03 12:42:32 +09:00
Kohya S
6084611508 Merge pull request #559 from kohya-ss/dev
max norm, dropout, scale v-pred loss
2023-06-03 11:40:56 +09:00
Kohya S
71a7a27319 update readme 2023-06-03 11:33:18 +09:00
Kohya S
ec2efe52e4 scale v-pred loss like noise pred 2023-06-03 10:52:22 +09:00
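Scaling the v-prediction loss "like noise pred" brings its magnitude in line with ε-prediction training. The sketch below uses an SNR/(SNR+1) factor with clipping; the factor and clipping value are my reading of the feature, not quoted from the commit.

```python
# Sketch: scale a per-sample v-prediction MSE loss by snr/(snr+1) so its
# magnitude resembles epsilon-prediction loss. Clipping value and names are
# assumptions for illustration.
import torch

def scale_v_pred_loss_like_noise_pred(loss: torch.Tensor, snr: torch.Tensor) -> torch.Tensor:
    # loss, snr: one value per sample in the batch
    snr = torch.clamp(snr, max=1000.0)   # SNR diverges as t -> 0
    return loss * (snr / (snr + 1.0))

loss = torch.tensor([0.30, 0.12])
snr = torch.tensor([25.0, 0.5])
print(scale_v_pred_loss_like_noise_pred(loss, snr))
```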
Kohya S
0f0158ddaa scale in rank dropout, check training in dropout 2023-06-02 07:29:59 +09:00
Kohya S
dde7807b00 add rank dropout/module dropout 2023-06-01 22:21:36 +09:00
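Rank dropout zeroes random ranks of the LoRA branch during training (rescaling the survivors, per the "scale in rank dropout" commit above), while module dropout occasionally skips the LoRA branch entirely. A conceptual sketch, with shapes and scaling as assumptions:

```python
# Conceptual sketch of rank dropout and module dropout for a LoRA layer;
# not the repository's exact implementation.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank=4, alpha=1.0,
                 rank_dropout=0.0, module_dropout=0.0):
        super().__init__()
        self.base = base
        self.down = nn.Linear(base.in_features, rank, bias=False)
        self.up = nn.Linear(rank, base.out_features, bias=False)
        self.scale = alpha / rank
        self.rank_dropout = rank_dropout
        self.module_dropout = module_dropout

    def forward(self, x):
        out = self.base(x)
        # module dropout: occasionally skip the LoRA branch entirely
        if self.training and self.module_dropout > 0 and torch.rand(1).item() < self.module_dropout:
            return out
        h = self.down(x)
        # rank dropout: zero random ranks and rescale the survivors
        if self.training and self.rank_dropout > 0:
            mask = (torch.rand(h.shape[-1], device=h.device) > self.rank_dropout).float()
            h = h * mask / (1.0 - self.rank_dropout)
        return out + self.up(h) * self.scale
```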
Kohya S
f8e8df5a04 fix crash gen script, change to network_dropout 2023-06-01 20:07:04 +09:00
Kohya S
f4c9276336 add scaling to max norm 2023-06-01 19:46:17 +09:00
Kohya S
a5c38e5d5b fix crashing when max_norm is disabled 2023-06-01 19:32:22 +09:00
AI-Casanova
9c7237157d Dropout and Max Norm Regularization for LoRA training (#545)
* Instantiate max_norm

* minor

* Move to end of step

* argparse

* metadata

* phrasing

* Sqrt ratio and logging

* fix logging

* Dropout test

* Dropout Args

* Dropout changed to affect LoRA only

---------

Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-06-01 14:58:38 +09:00
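The max-norm regularization from #545 rescales a LoRA module after the optimizer step when the norm of its effective update exceeds a threshold, splitting the shrink as a square-root ratio across both factors (the "Sqrt ratio" item above). An illustrative sketch with assumed names:

```python
# Sketch of max-norm regularization applied after the optimizer step: if the
# combined LoRA delta (up @ down * scale) exceeds max_norm, both matrices are
# scaled down by sqrt(ratio). Names and shapes are illustrative.
import torch

@torch.no_grad()
def apply_max_norm(down_weight, up_weight, scale, max_norm):
    delta = (up_weight @ down_weight) * scale   # effective weight update
    norm = delta.norm()
    if norm > max_norm:
        sqrt_ratio = (max_norm / norm) ** 0.5   # split the shrink between both factors
        down_weight.mul_(sqrt_ratio)
        up_weight.mul_(sqrt_ratio)
    return norm.item()

down = torch.randn(4, 320)   # (rank, in_features)
up = torch.randn(320, 4)     # (out_features, rank)
apply_max_norm(down, up, scale=1.0, max_norm=1.0)
```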
TingTingin
5931948adb Adjusted English grammar in logs to be more clear (#554)
* Update train_network.py

* Update train_network.py

* Update train_network.py

* Update train_network.py

* Update train_network.py

* Update train_network.py
2023-06-01 12:31:33 +09:00
Kohya S
8a5e3904a0 Merge pull request #553 from kohya-ss/dev
no caption warning, network merging before training
2023-05-31 21:04:50 +09:00
Kohya S
d679dc4de1 Merge branch 'main' into dev 2023-05-31 20:58:32 +09:00
Kohya S
a002d10a4d update readme 2023-05-31 20:57:01 +09:00
Kohya S
3a06968332 warn and continue if huggingface uploading failed 2023-05-31 20:48:33 +09:00
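The "warn and continue" behavior wraps the upload in a try/except so a network failure cannot abort training. A minimal sketch with a placeholder upload callable:

```python
# Sketch of warn-and-continue around an upload call; upload_fn is a placeholder.
import logging

def upload_or_warn(upload_fn, *args, **kwargs):
    try:
        upload_fn(*args, **kwargs)
    except Exception as e:  # deliberately broad: uploading is best-effort
        logging.warning("HuggingFace upload failed, continuing training: %s", e)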
Kohya S
6fbd526931 show multiplier for base weights to console 2023-05-31 20:23:19 +09:00
Kohya S
c437dce056 change option name for merging network weights 2023-05-30 23:19:29 +09:00
Kohya S
fc00691898 enable multiple module weights 2023-05-30 23:10:41 +09:00
Kohya S
990ceddd14 show warning if no caption and no class token 2023-05-30 22:53:50 +09:00
Kohya S
226db64736 Merge pull request #542 from u-haru/feature/differential_learning
add differential learning feature
2023-05-29 08:38:46 +09:00
Kohya S
2429ac73b2 Merge pull request #533 from TingTingin/main
Added warning on training without captions
2023-05-29 08:37:33 +09:00
u-haru
dd8e17cb37 add differential learning feature 2023-05-27 05:15:02 +09:00
TingTingin
db756e9a34 Update train_util.py
I removed the sleep since it triggers per subset, and with many subsets it would trigger multiple times
2023-05-26 08:08:34 -04:00
Kohya S
16e5981d31 Merge pull request #538 from kohya-ss/dev
update train_network doc. add warning to merge_lora.py
2023-05-25 22:24:16 +09:00
Kohya S
575c51fd3b Merge branch 'main' into dev 2023-05-25 22:14:40 +09:00
Kohya S
5b2447f71d add warning to merge_lora.py 2023-05-25 22:14:21 +09:00
Kohya S
0ccb4d4a3a Merge pull request #537 from kohya-ss/dev
support D-Adaptation v3.0
2023-05-25 22:05:24 +09:00
Kohya S
b5bb8bec67 update readme 2023-05-25 22:03:04 +09:00
青龍聖者@bdsqlsz
5cdf4e34a1 support for D-Adaptation V3 (#530)
* Update train_util.py for DAdaptLion

* Update train_README-zh.md for dadaptlion

* Update train_README-ja.md for DAdaptLion

* add DAdapt V3

* Alignment

* Update train_util.py for experimental

* Update train_util.py V3

* Update train_README-zh.md

* Update train_README-ja.md

* Update train_util.py fix

* Update train_util.py

---------

Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-05-25 21:52:36 +09:00
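D-Adaptation v3 ships optimizers such as DAdaptAdam and DAdaptLion that estimate their own step size, so the learning rate is usually left at 1.0. A hedged sketch of using the dadaptation package directly; the training script wires this up through its own optimizer-selection option instead.

```python
# Hedged sketch of using the dadaptation package (v3.x) directly.
import torch
import dadaptation

model = torch.nn.Linear(16, 16)
# With D-Adaptation the base learning rate is typically 1.0; the optimizer
# adapts the effective step size on its own.
optimizer = dadaptation.DAdaptAdam(model.parameters(), lr=1.0)

x = torch.randn(8, 16)
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```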
TingTingin
061e157191 Update train_util.py 2023-05-23 02:02:39 -04:00
TingTingin
d859a3a925 Update train_util.py
fix mistake
2023-05-23 02:00:33 -04:00
TingTingin
5a1a14f9fc Update train_util.py
Added feature to add "." if missing in caption_extension
Added warning on training without captions
2023-05-23 01:57:35 -04:00
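The caption_extension tweak simply prepends a "." when the user omits it. A small sketch:

```python
# Sketch of the caption_extension normalization: add a leading "." if missing.
def normalize_caption_extension(ext: str) -> str:
    if ext and not ext.startswith("."):
        ext = "." + ext
    return ext

print(normalize_caption_extension("txt"))   # ".txt"
print(normalize_caption_extension(".txt"))  # ".txt"
```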
Kohya S
b6ba4cac83 Merge pull request #528 from kohya-ss/dev
save_state handling, old LoRA support etc.
2023-05-22 18:51:18 +09:00