Kohya S
f8e8df5a04
fix crash in gen script, change to network_dropout
2023-06-01 20:07:04 +09:00
AI-Casanova
9c7237157d
Dropout and Max Norm Regularization for LoRA training ( #545 )
...
* Instantiate max_norm
* minor
* Move to end of step
* argparse
* metadata
* phrasing
* Sqrt ratio and logging
* fix logging
* Dropout test
* Dropout Args
* Dropout changed to affect LoRA only
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-06-01 14:58:38 +09:00
琴动我心
f7a1868fc2
fix: support old LoRA without alpha, which raised "TypeError: argument of type 'int' is not iterable"
2023-05-22 17:15:51 +08:00
Kohya S
2767a0f9f2
common block LR args processing in create
2023-05-11 21:47:59 +09:00
Kohya S
fdbdb4748a
pre-calc LoRA in generating
2023-05-07 09:57:54 +09:00
Kohya S
5c020bed49
Add attention couple + regional LoRA
2023-04-06 08:11:54 +09:00
Kohya S
83c7e03d05
Fix network_weights not working in train_network
2023-04-03 22:45:28 +09:00
Kohya S
6134619998
Add block dim(rank) feature
2023-04-03 21:19:49 +09:00
Kohya S
3beddf341e
Support LR graphs for each block, base LR
2023-04-03 08:43:11 +09:00
Kohya S
c639cb7d5d
support older type hints
2023-04-02 16:18:04 +09:00
Kohya S
97e65bf93f
change 'stratify' to 'block', add English message
2023-04-02 16:10:09 +09:00
u-haru
19340d82e6
Combine params when per-layer learning rates are not used
2023-04-02 12:57:55 +09:00
u-haru
058e442072
Change layer count (based on hako-mikan/sd-webui-lora-block-weight)
2023-04-02 04:02:34 +09:00
u-haru
3032a47af4
Change cosine to reversed sine
2023-03-31 01:42:57 +09:00
u-haru
1b75dbd4f2
Add _lr to argument names
2023-03-31 01:40:29 +09:00
u-haru
dade23a414
Rename to stratified_zero_threshold
2023-03-31 01:14:03 +09:00
u-haru
4dacc52bde
implement stratified_lr
2023-03-31 00:39:35 +09:00
Kohya S
2d6faa9860
support LoRA merge in advance
2023-03-30 21:34:36 +09:00
Kohya S
bf3674c1db
format by black
2023-03-29 21:23:27 +09:00
Kohya S
75d1883da6
fix: LoRA rank was limited to target dim
2023-03-10 21:12:15 +09:00
Kohya S
e7051d427c
fix default conv alpha to 1
2023-03-09 20:26:14 +09:00
Kohya S
c4b4d1cb40
fix: LoRA was always expanded to Conv2d-3x3
2023-03-09 08:47:13 +09:00
Kohya S
19386df6e9
expand LoRA to all Conv2d
2023-03-06 22:03:09 +09:00
Kohya S
fe4f4446f1
Add region control for LoRA
2023-03-04 18:03:11 +09:00
Kohya S
d94c0d70fe
support network mul from prompt
2023-02-19 18:43:35 +09:00
Kohya S
b3020db63f
support python 3.8
2023-02-07 22:29:12 +09:00
Kohya S
e6bad080cb
Merge pull request #102 from space-nuko/precalculate-hashes
...
Precalculate .safetensors model hashes after training
2023-01-24 19:03:45 +09:00
Kohya S
bf3a13bb4e
Fix error for loading bf16 weights
2023-01-24 18:57:21 +09:00
space-nuko
f7fbdc4b2a
Precalculate .safetensors model hashes after training
2023-01-23 17:21:04 -08:00
Kohya S
b4636d4185
Add scaling alpha for LoRA
2023-01-21 20:37:34 +09:00
Kohya S
eba142ccb2
do not save metadata in .pt/.ckpt
2023-01-12 21:52:55 +09:00
Kohya S
9fd91d26a3
Store metadata to .ckpt as value of state dict
2023-01-12 10:54:21 +09:00
Kohya S
e4f9b2b715
Add VAE to metadata, add no_metadata option
2023-01-11 23:12:18 +09:00
space-nuko
de37fd9906
Fix metadata loading
2023-01-10 02:56:35 -08:00
space-nuko
2e4ce0fdff
Add training metadata to output LoRA model
2023-01-10 02:49:52 -08:00
Kohya S
445b34de1f
Add LoRA training/generating
2022-12-25 21:34:59 +09:00