Commit Graph

258 Commits

Date                        SHA1        Author            Message
2025-05-07 23:41:48 -04:00  e0f1ae0f2c  rockerBOO         Fix default initialization
2025-05-07 23:33:54 -04:00  19b67643e4  rockerBOO         Revert "Fix default initialization"
    This reverts commit 554674909a.
2025-05-07 23:33:08 -04:00  554674909a  rockerBOO         Fix default initialization
2025-05-07 23:26:26 -04:00  ef8371243b  rockerBOO         Add lowrank SVD for PiSSA. Implement URAE conversion
2025-04-10 22:07:46 -04:00  5391c4fbe9  rockerBOO         Fix typo
2025-04-10 21:59:39 -04:00  adb0e54093  rockerBOO         Fix GGPO variables. Fix no _org lora values.
    - Add pythonpath = . to pytest to get the current directory
    - Fix device of LoRA after PiSSA initialization to return to proper device
2025-04-10 20:59:47 -04:00  9d7e2dd7c9  rockerBOO         Fix LoRA dtype when saving PiSSA
    Change lora_util to network_utils to match terms.
2025-04-04 15:54:15 -04:00  5f927444d0  rockerBOO         Merge branch 'sd3' into flux-lora-init
2025-04-03 20:32:34 -04:00  87fe284a76  rockerBOO         Properly move original model weights to device, offload org lora weights to CPU
2025-03-30 21:07:31 +09:00  59d98e45a9  Kohya S.          Merge pull request #1974 from rockerBOO/lora-ggpo
    Add LoRA-GGPO for Flux
2025-03-26 14:23:04 -04:00  182544dcce  rockerBOO         Remove pertubation seed
2025-03-25 20:43:10 -04:00  da47d17231  rockerBOO         Make sure on better device (cuda if available) for initialization
2025-03-25 19:05:03 -04:00  54d4de0e72  rockerBOO         Autocast shouldn't be on dtype float32
2025-03-25 18:22:07 -04:00  0ad3b3c2bd  rockerBOO         Update initialization, add lora_util, add tests
2025-03-24 19:01:11 -04:00  0bad5ae9f1  rockerBOO         Detach and clone original LoRA weights before training
2025-03-24 16:31:17 -04:00  3356314002  rockerBOO         Add PiSSA decomposition before saving
2025-03-24 04:22:12 -04:00  58bdf85ab4  rockerBOO         Remove rank stabilization
2025-03-24 04:08:08 -04:00  85928dd3b0  rockerBOO         Add initialization URAE, PiSSA for flux
2025-03-21 22:07:50 +09:00  6364379f17  Kohya S           Merge branch 'dev' into sd3
2025-03-18 14:25:09 -04:00  3647d065b5  rockerBOO         Cache weight norms estimate on initialization. Move to update norms every step
2025-03-06 00:00:38 -05:00  ea53290f62  rockerBOO         Add LoRA-GGPO for Flux
2025-03-01 20:35:45 +03:00  acdca2abb7  Ivan Chikish      Fix [occasionally] missing text encoder attn modules
    Should fix #1952
    I added alternative name for CLIPAttention.
    I have no idea why this name changed.
    Now it should accept both names.
2024-11-07 09:54:12 +00:00  e54462a4a9  Dango233          Fix SD3 trained lora loading and merging
2024-10-29 23:29:50 +09:00  b502f58488  kohya-ss          Fix emb_dim to work.
2024-10-29 22:29:24 +09:00  ce5b532582  kohya-ss          Fix additional LoRA to work
2024-10-29 21:51:56 +09:00  0af4edd8a6  kohya-ss          Fix split_qkv
2024-10-27 10:20:35 +09:00  b649bbf2b6  Kohya S           Merge branch 'sd3' into sd3_5_support
2024-10-27 10:20:14 +09:00  731664b8c3  Kohya S           Merge branch 'dev' into sd3
2024-10-27 10:19:55 +09:00  e070bd9973  Kohya S           Merge branch 'main' into dev
2024-10-27 10:19:05 +09:00  ca44e3e447  Kohya S           reduce VRAM usage, instead of increasing main RAM usage
2024-10-26 22:03:41 +09:00  150579db32  Kohya S           Merge branch 'sd3' into sd3_5_support
2024-10-26 22:02:57 +09:00  8549669f89  Kohya S           Merge branch 'dev' into sd3
2024-10-26 22:02:36 +09:00  900d551a6a  Kohya S           Merge branch 'main' into dev
2024-10-26 22:01:10 +09:00  56b4ea963e  Kohya S           Fix LoRA metadata hash calculation bug in svd_merge_lora.py, sdxl_merge_lora.py, and resize_lora.py closes #1722
2024-10-25 21:58:31 +09:00  d2c549d7b2  kohya-ss          support SD3 LoRA
2024-09-28 20:57:27 +09:00  822fe57859  Kohya S           add workaround for 'Some tensors share memory' error #1614
2024-09-19 21:15:39 +09:00  706a48d50e  Kohya S           Merge branch 'dev' into sd3
2024-09-19 21:15:19 +09:00  d7e14721e2  Kohya S           Merge branch 'main' into dev
2024-09-19 21:14:57 +09:00  9c757c2fba  Kohya S           fix SDXL block index to match LBW
2024-09-17 21:21:28 +09:00  0cbe95bcc7  Kohya S           fix text_encoder_lr to work with int closes #1608
2024-09-16 23:14:09 +09:00  d8d15f1a7e  Kohya S           add support for specifying blocks in FLUX.1 LoRA training
2024-09-14 22:17:52 +09:00  c9ff4de905  Kohya S           Add support for specifying rank for each layer in FLUX.1
2024-09-14 15:48:16 +09:00  2d8ee3c280  Kohya S           OFT for FLUX.1
2024-09-13 22:39:24 +09:00  0485f236a0  Kohya S           Merge branch 'dev' into sd3
2024-09-13 22:37:11 +09:00  93d9fbf607  Kohya S           improve OFT implementation closes #944
2024-09-13 21:30:49 +09:00  c15a3a1a65  Kohya S           Merge branch 'dev' into sd3
2024-09-13 21:29:51 +09:00  43ad73860d  Kohya S           Merge branch 'main' into dev
2024-09-13 21:29:31 +09:00  b755ebd0a4  Kohya S           add LBW support for SDXL merge LoRA
2024-09-13 21:26:06 +09:00  f4a0bea6dc  Kohya S           format by black
2024-09-13 20:45:35 +09:00  734d2e5b2b  terracottahaniwa  Support Lora Block Weight (LBW) to svd_merge_lora.py (#1575)
    * support lora block weight
    * solve license incompatibility
    * Fix issue: lbw index calculation
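
Several entries in this history ("Add lowrank SVD for PiSSA", "Add PiSSA decomposition before saving") center on PiSSA-style initialization: build the LoRA pair B @ A from the top-r singular directions of the pretrained weight, and keep only the residual in the base weight. A minimal NumPy sketch of that idea follows; the function name and signature are illustrative assumptions, not the repository's actual API:

```python
import numpy as np

def pissa_init(weight: np.ndarray, rank: int):
    """Illustrative PiSSA-style split of a pretrained weight (out, in)
    into a rank-r LoRA pair plus a residual base weight."""
    # Truncated SVD: keep the top-`rank` singular triplets.
    u, s, vh = np.linalg.svd(weight, full_matrices=False)
    sqrt_s = np.sqrt(s[:rank])
    # Principal components initialize the adapter pair.
    lora_B = u[:, :rank] * sqrt_s          # shape (out, rank)
    lora_A = sqrt_s[:, None] * vh[:rank]   # shape (rank, in)
    # The base weight keeps only the residual, so that
    # residual + lora_B @ lora_A reproduces the original weight.
    residual = weight - lora_B @ lora_A
    return lora_A, lora_B, residual
```

Training then updates only lora_A and lora_B, while the residual stays frozen; at save time the decomposition can be reversed, which is why a commit above performs the "PiSSA decomposition before saving".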