Kohya S
f7b1911f1b
Merge branch 'main' into dev
2023-06-08 22:03:06 +09:00
Kohya S
045cd38b6e
fix clip_skip not working in weighted captions and sample generation
2023-06-08 22:02:46 +09:00
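The clip_skip fix above concerns which text-encoder layer is used for conditioning. A minimal sketch of the usual clip_skip convention (function name and list layout are my assumptions, not the repo's exact code):

```python
def apply_clip_skip(hidden_states, clip_skip):
    """Pick the text-encoder layer output used for conditioning.

    hidden_states: list of per-layer outputs (index 0 = embeddings,
    index -1 = last layer). clip_skip=1 keeps the last layer,
    clip_skip=2 the second-to-last, and so on. Real pipelines usually
    re-apply the encoder's final layer norm afterwards.
    """
    if clip_skip is None or clip_skip < 1:
        clip_skip = 1
    return hidden_states[-clip_skip]
```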
Kohya S
dccdb8771c
support sample generation in training
2023-06-07 08:12:52 +09:00
Kohya S
d4b5cab7f7
Merge branch 'main' into original-u-net
2023-06-07 07:42:27 +09:00
Kohya S
363f1dfab9
Merge pull request #569 from kohya-ss/dev
...
older lycoris support, BREAK support
2023-06-06 22:07:21 +09:00
Kohya S
4e24733f1c
update readme
2023-06-06 22:03:21 +09:00
Kohya S
bb91a10b5f
fix to work with LyCORIS < 0.1.6
2023-06-06 21:59:57 +09:00
Kohya S
98635ebde2
Merge branch 'main' into dev
2023-06-06 21:54:29 +09:00
Kohya S
24823b061d
support BREAK in generation script
2023-06-06 21:53:58 +09:00
Kohya S
0fe1afd4ef
Merge pull request #562 from u-haru/hotfix/max_mean_logs_with_loss
...
Add loss display
2023-06-05 21:42:25 +09:00
Kohya S
c0a7df9ee1
fix eps value, enable xformers, etc.
2023-06-03 21:29:27 +09:00
u-haru
5907bbd9de
Add loss display
2023-06-03 21:20:26 +09:00
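u-haru's commit adds loss figures to the training progress logs (the PR branch is named hotfix/max_mean_logs_with_loss). A hedged sketch of one way to keep windowed mean/max statistics for such logging; the class and names are illustrative, not the repo's:

```python
from collections import deque


class LossRecorder:
    """Track recent per-step losses and expose mean/max for progress logs."""

    def __init__(self, window=100):
        # deque with maxlen drops the oldest entry once full
        self.losses = deque(maxlen=window)

    def add(self, loss):
        self.losses.append(float(loss))

    @property
    def mean(self):
        return sum(self.losses) / len(self.losses)

    @property
    def max(self):
        return max(self.losses)
```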
Kohya S
5db792b10b
initial commit for original U-Net
2023-06-03 19:24:47 +09:00
Kohya S
7c38c33ed6
Merge pull request #560 from kohya-ss/dev
...
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:46:02 +09:00
Kohya S
5bec05e045
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:42:32 +09:00
Kohya S
6084611508
Merge pull request #559 from kohya-ss/dev
...
max norm, dropout, scale v-pred loss
2023-06-03 11:40:56 +09:00
Kohya S
71a7a27319
update readme
2023-06-03 11:33:18 +09:00
Kohya S
ec2efe52e4
scale v-pred loss like noise pred
2023-06-03 10:52:22 +09:00
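The commit above rebalances the v-prediction loss so its magnitude behaves like a noise (epsilon) prediction loss. A sketch under the common assumption that the per-timestep factor is SNR/(SNR+1), with SNR derived from the scheduler's alphas_cumprod; the function name and exact factor are assumptions, not a verbatim copy of the repo:

```python
import numpy as np


def scale_v_pred_loss_like_noise_pred(loss, timesteps, alphas_cumprod):
    """Rescale per-sample v-prediction loss toward epsilon-loss magnitude.

    loss:           per-sample losses, shape (batch,)
    timesteps:      sampled timestep index per sample, shape (batch,)
    alphas_cumprod: scheduler's cumulative alpha products, shape (T,)
    """
    abar = alphas_cumprod[timesteps]
    snr = abar / (1.0 - abar)           # signal-to-noise ratio per sample
    return loss * snr / (snr + 1.0)     # assumed v->eps scaling factor
```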
Kohya S
0f0158ddaa
apply scaling in rank dropout; check training flag in dropout
2023-06-02 07:29:59 +09:00
Kohya S
dde7807b00
add rank dropout/module dropout
2023-06-01 22:21:36 +09:00
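The dropout commits above add two LoRA-specific regularizers. An illustrative NumPy sketch of both (the repo works in PyTorch; the function name, shapes, and exact placement of the dropouts here are assumptions):

```python
import numpy as np


def lora_delta(x, down, up, scale, rank_dropout=0.0, module_dropout=0.0,
               training=True, rng=None):
    """Compute the LoRA contribution with two dropout variants.

    x: (batch, in_dim), down: (in_dim, rank), up: (rank, out_dim)
    - module dropout: with probability p, skip the whole LoRA delta
    - rank dropout: zero random rank channels after the down projection,
      rescaling survivors by 1/(1 - p) to preserve the expectation
    """
    rng = rng or np.random.default_rng()
    if training and module_dropout > 0.0 and rng.random() < module_dropout:
        return np.zeros((x.shape[0], up.shape[1]))  # LoRA contributes nothing
    h = x @ down
    if training and rank_dropout > 0.0:
        keep = rng.random(h.shape[1]) >= rank_dropout
        h = h * keep / (1.0 - rank_dropout)
    return (h @ up) * scale
```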
ddPn08
1e3daa247b
fix bucketing
2023-06-01 21:58:45 +09:00
ddPn08
3bd00b88c2
support for controlnet in sample output
2023-06-01 20:48:30 +09:00
ddPn08
62d00b4520
add controlnet training
2023-06-01 20:48:25 +09:00
ddPn08
4f8ce00477
update diffusers to 1.16 | finetune
2023-06-01 20:47:54 +09:00
ddPn08
1214f35985
update diffusers to 1.16 | train_db
2023-06-01 20:39:31 +09:00
ddPn08
e743ee5d5c
update diffusers to 1.16 | dylora
2023-06-01 20:39:30 +09:00
ddPn08
23c4e5cb01
update diffusers to 1.16 | train_textual_inversion
2023-06-01 20:39:29 +09:00
ddPn08
1f1cae6c5a
make the device of snr_weight the same as loss
2023-06-01 20:39:28 +09:00
ddPn08
c8d209d36c
update diffusers to 1.16 | train_network
2023-06-01 20:39:26 +09:00
Kohya S
f8e8df5a04
fix crash in generation script; rename option to network_dropout
2023-06-01 20:07:04 +09:00
Kohya S
f4c9276336
add scaling to max norm
2023-06-01 19:46:17 +09:00
Kohya S
a5c38e5d5b
fix crashing when max_norm is disabled
2023-06-01 19:32:22 +09:00
AI-Casanova
9c7237157d
Dropout and Max Norm Regularization for LoRA training (#545)
...
* Instantiate max_norm
* minor
* Move to end of step
* argparse
* metadata
* phrasing
* Sqrt ratio and logging
* fix logging
* Dropout test
* Dropout Args
* Dropout changed to affect LoRA only
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-06-01 14:58:38 +09:00
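The max-norm half of PR #545 runs at the end of each optimizer step (per the "Move to end of step" and "Sqrt ratio and logging" bullets). A hedged sketch of that idea for one LoRA pair; the variable names and return shape are mine, not the repo's:

```python
import numpy as np


def apply_max_norm(down, up, scale, max_norm):
    """Clamp the norm of one LoRA pair's combined update after a step.

    The effective update is scale * (down @ up); if its Frobenius norm
    exceeds max_norm, both factors shrink by sqrt(ratio) so the product
    shrinks by exactly ratio. Returns the new matrices plus the ratio so
    a caller can log how often clamping fired.
    """
    norm = float(np.linalg.norm(down @ up)) * scale
    ratio = 1.0 if norm <= max_norm else max_norm / norm
    s = np.sqrt(ratio)
    return down * s, up * s, ratio
```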
TingTingin
5931948adb
Adjusted English grammar in logs to be clearer (#554)
...
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
* Update train_network.py
2023-06-01 12:31:33 +09:00
Kohya S
8a5e3904a0
Merge pull request #553 from kohya-ss/dev
...
no caption warning, network merging before training
2023-05-31 21:04:50 +09:00
Kohya S
d679dc4de1
Merge branch 'main' into dev
2023-05-31 20:58:32 +09:00
Kohya S
a002d10a4d
update readme
2023-05-31 20:57:01 +09:00
Kohya S
3a06968332
warn and continue if Hugging Face upload fails
2023-05-31 20:48:33 +09:00
Kohya S
6fbd526931
show multiplier for base weights to console
2023-05-31 20:23:19 +09:00
Kohya S
c437dce056
change option name for merging network weights
2023-05-30 23:19:29 +09:00
Kohya S
fc00691898
enable multiple module weights
2023-05-30 23:10:41 +09:00
Kohya S
990ceddd14
show warning if no caption and no class token
2023-05-30 22:53:50 +09:00
Kohya S
226db64736
Merge pull request #542 from u-haru/feature/differential_learning
...
Add differential learning feature
2023-05-29 08:38:46 +09:00
Kohya S
2429ac73b2
Merge pull request #533 from TingTingin/main
...
Added warning on training without captions
2023-05-29 08:37:33 +09:00
u-haru
dd8e17cb37
Add differential learning feature
2023-05-27 05:15:02 +09:00
TingTingin
db756e9a34
Update train_util.py
...
Removed the sleep: it triggers per subset, so with many subsets it would fire multiple times
2023-05-26 08:08:34 -04:00
Kohya S
16e5981d31
Merge pull request #538 from kohya-ss/dev
...
update train_network doc. add warning to merge_lora.py
2023-05-25 22:24:16 +09:00
Kohya S
575c51fd3b
Merge branch 'main' into dev
2023-05-25 22:14:40 +09:00
Kohya S
5b2447f71d
add warning to merge_lora.py
2023-05-25 22:14:21 +09:00
Kohya S
0ccb4d4a3a
Merge pull request #537 from kohya-ss/dev
...
support D-Adaptation v3.0
2023-05-25 22:05:24 +09:00