Ivan Chikish
acdca2abb7
Fix [occasionally] missing text encoder attn modules
...
Should fix #1952
I added an alternative name for CLIPAttention.
I have no idea why this name changed.
Now it should accept both names.
2025-03-01 20:35:45 +03:00
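The entry above matches the text encoder's attention modules by class name and now accepts both spellings. A minimal sketch of that idea, assuming the newer name is `CLIPSdpaAttention` (the exact alternative name is an assumption, not taken from this log):

```python
# transformers has used different class names for the text encoder's
# attention layers across versions; match by name and accept both.
# "CLIPSdpaAttention" as the newer spelling is an assumption here.
TEXT_ENCODER_ATTN_CLASSES = {"CLIPAttention", "CLIPSdpaAttention"}

def is_text_encoder_attn(module) -> bool:
    # Recognize the module by class name, whichever spelling is in use.
    return type(module).__name__ in TEXT_ENCODER_ATTN_CLASSES

# Stand-in for the real transformers class, for demonstration only.
class CLIPSdpaAttention:
    pass

print(is_text_encoder_attn(CLIPSdpaAttention()))  # prints: True
```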
Kohya S
afb971f9c3
fix SD1.5 LoRA extraction #1490
2024-08-22 21:33:15 +09:00
rockerBOO
00513b9b70
Add LoRA+ LR Ratio info message to logger
2024-05-23 22:27:12 -04:00
Kohya S
146edce693
support Diffusers' based SDXL LoRA key for inference
2024-05-18 11:05:04 +09:00
Kohya S
16677da0d9
fix create_network_from_weights not working
2024-05-12 22:15:07 +09:00
Kohya S
44190416c6
update docs etc.
2024-05-12 17:01:20 +09:00
Kohya S
7fe81502d0
update loraplus on dylora/lora_fa
2024-05-06 11:09:32 +09:00
Kohya S
58c2d856ae
support block dim/lr for sdxl
2024-05-03 22:18:20 +09:00
Kohya S
969f82ab47
move loraplus args from args to network_args, simplify log lr desc
2024-04-29 20:04:25 +09:00
Kohya S
834445a1d6
Merge pull request #1233 from rockerBOO/lora-plus
...
Add LoRA+ support
2024-04-29 18:05:12 +09:00
rockerBOO
68467bdf4d
Fix unset or invalid LR from making a param_group
2024-04-11 17:33:19 -04:00
rockerBOO
75833e84a1
Fix default LR, Add overall LoRA+ ratio, Add log
...
`--loraplus_ratio` added for both TE and UNet
Add log for lora+
2024-04-08 19:23:02 -04:00
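The LoRA+ entries above add a ratio applied to part of the parameters. A minimal sketch of how such param groups could be built, assuming sd-scripts' `lora_up`/`lora_down` naming and that the ratio multiplies the LR of the "up" (B) matrices (illustrative, not the actual implementation):

```python
# Sketch of LoRA+ optimizer param groups: "up" (B) weights get the base LR
# times the LoRA+ ratio, "down" (A) weights keep the base LR.
def make_loraplus_param_groups(named_params, base_lr, loraplus_ratio):
    up_params, down_params = [], []
    for name, param in named_params:
        (up_params if "lora_up" in name else down_params).append(param)
    groups = []
    if down_params:
        groups.append({"params": down_params, "lr": base_lr})
    if up_params:
        groups.append({"params": up_params, "lr": base_lr * loraplus_ratio})
    return groups

groups = make_loraplus_param_groups(
    [("lora_down.weight", "A"), ("lora_up.weight", "B")],
    base_lr=1e-4,
    loraplus_ratio=16,
)
print([g["lr"] for g in groups])  # prints: [0.0001, 0.0016]
```

The groups can then be passed directly to an optimizer constructor that accepts per-group learning rates.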
rockerBOO
1933ab4b48
Fix default_lr being applied
2024-04-03 12:46:34 -04:00
Kohya S
b748b48dbb
fix attention couple + deep shrink causing errors at some resolutions
2024-04-03 12:43:08 +09:00
rockerBOO
f99fe281cb
Add LoRA+ support
2024-04-01 15:38:26 -04:00
Kohya S
cbe9c5dc06
support deep shrink with regional lora, add prompter module
2024-02-12 14:17:27 +09:00
Yuta Hayashibe
5f6bf29e52
Replace print with logger if they are logs (#905)
...
* Add get_my_logger()
* Use logger instead of print
* Fix log level
* Removed line-breaks for readability
* Use setup_logging()
* Add rich to requirements.txt
* Make simple
* Use logger instead of print
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2024-02-04 18:14:34 +09:00
Kohya S
207fc8b256
fix regional LoRA to work
2023-09-03 17:50:27 +09:00
Kohya S
66c03be45f
Fix invalid TE key names for SD1/2 LoRA
2023-07-08 09:56:38 +09:00
Kohya S
747af145ed
add sdxl fine-tuning and LoRA
2023-06-26 08:07:24 +09:00
Kohya S
92e50133f8
Merge branch 'original-u-net' into dev
2023-06-17 21:57:08 +09:00
Kohya S
bb91a10b5f
fix to work with LyCORIS<0.1.6
2023-06-06 21:59:57 +09:00
Kohya S
5bec05e045
move max_norm to lora to avoid crashing in lycoris
2023-06-03 12:42:32 +09:00
Kohya S
0f0158ddaa
scale in rank dropout, check training in dropout
2023-06-02 07:29:59 +09:00
Kohya S
dde7807b00
add rank dropout/module dropout
2023-06-01 22:21:36 +09:00
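The two entries above add rank dropout and then fix its scaling. A minimal sketch of the idea, assuming survivors are rescaled by 1/(1-p) as in inverted dropout so the expected magnitude of the LoRA update is preserved (names are illustrative):

```python
import random

# Illustrative rank dropout: each rank dimension is dropped with
# probability p; kept dimensions are rescaled by 1 / (1 - p) so the
# expected magnitude of the LoRA update is unchanged.
def rank_dropout_mask(rank: int, p: float):
    scale = 1.0 / (1.0 - p)
    return [0.0 if random.random() < p else scale for _ in range(rank)]

mask = rank_dropout_mask(rank=8, p=0.25)
# Every entry is either 0.0 (dropped) or 4/3 (kept and rescaled).
```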
ddPn08
c8d209d36c
update diffusers to 1.16 | train_network
2023-06-01 20:39:26 +09:00
Kohya S
f8e8df5a04
fix crash in gen script, change to network_dropout
2023-06-01 20:07:04 +09:00
AI-Casanova
9c7237157d
Dropout and Max Norm Regularization for LoRA training (#545)
...
* Instantiate max_norm
* minor
* Move to end of step
* argparse
* metadata
* phrasing
* Sqrt ratio and logging
* fix logging
* Dropout test
* Dropout Args
* Dropout changed to affect LoRA only
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-06-01 14:58:38 +09:00
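The PR body above mentions a "Sqrt ratio" step for max-norm regularization. A minimal sketch of that idea, assuming each LoRA module whose update norm exceeds `max_norm` has both its down and up weights multiplied by sqrt(max_norm / norm), so their product shrinks back to the budget (names are assumptions):

```python
import math

# Sketch of per-module max-norm regularization for LoRA: if the update's
# norm exceeds max_norm, scale both factor matrices by sqrt(max_norm/norm)
# so their product's norm returns to max_norm.
def max_norm_scale(norm: float, max_norm: float) -> float:
    if norm <= max_norm:
        return 1.0  # within budget: leave the weights untouched
    return math.sqrt(max_norm / norm)

print(max_norm_scale(4.0, 1.0))  # prints: 0.5
```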
琴动我心
f7a1868fc2
fix: support old LoRA without alpha that raises "TypeError: argument of type 'int' is not iterable"
2023-05-22 17:15:51 +08:00
Kohya S
2767a0f9f2
common processing of block lr args in create
2023-05-11 21:47:59 +09:00
Kohya S
fdbdb4748a
pre-calculate LoRA when generating
2023-05-07 09:57:54 +09:00
Kohya S
5c020bed49
Add attention couple + regional LoRA
2023-04-06 08:11:54 +09:00
Kohya S
83c7e03d05
Fix network_weights not working in train_network
2023-04-03 22:45:28 +09:00
Kohya S
6134619998
Add block dim(rank) feature
2023-04-03 21:19:49 +09:00
Kohya S
3beddf341e
Support LR graphs for each block, base lr
2023-04-03 08:43:11 +09:00
Kohya S
c639cb7d5d
support older type hints
2023-04-02 16:18:04 +09:00
Kohya S
97e65bf93f
change 'stratify' to 'block', add English message
2023-04-02 16:10:09 +09:00
u-haru
19340d82e6
Combine params when layer-wise learning rates are not used
2023-04-02 12:57:55 +09:00
u-haru
058e442072
Change the number of layers (referencing hako-mikan/sd-webui-lora-block-weight)
2023-04-02 04:02:34 +09:00
u-haru
3032a47af4
Change cosine to reversed sine
2023-03-31 01:42:57 +09:00
u-haru
1b75dbd4f2
Add _lr to argument names
2023-03-31 01:40:29 +09:00
u-haru
dade23a414
Rename to stratified_zero_threshold
2023-03-31 01:14:03 +09:00
u-haru
4dacc52bde
implement stratified_lr
2023-03-31 00:39:35 +09:00
Kohya S
2d6faa9860
support LoRA merge in advance
2023-03-30 21:34:36 +09:00
Kohya S
bf3674c1db
format by black
2023-03-29 21:23:27 +09:00
Kohya S
75d1883da6
fix LoRA rank being limited to target dim
2023-03-10 21:12:15 +09:00
Kohya S
e7051d427c
fix default conv alpha to 1
2023-03-09 20:26:14 +09:00
Kohya S
c4b4d1cb40
fix LoRA always being expanded to Conv2d-3x3
2023-03-09 08:47:13 +09:00
Kohya S
19386df6e9
expand LoRA to all Conv2d
2023-03-06 22:03:09 +09:00
Kohya S
fe4f4446f1
Add region control for LoRA
2023-03-04 18:03:11 +09:00