Commit Graph

19 Commits

Author | SHA1 | Message | Date
rockerBOO | f62c68df3c | Make grad_norm and combined_grad_norm None if not recording | 2025-05-01 01:37:57 -04:00
rockerBOO | 182544dcce | Remove perturbation seed | 2025-03-26 14:23:04 -04:00
rockerBOO | 3647d065b5 | Cache weight norm estimates at initialization; switch to updating norms every step | 2025-03-18 14:25:09 -04:00
rockerBOO | ea53290f62 | Add LoRA-GGPO for Flux | 2025-03-06 00:00:38 -05:00
Kohya S | 0cbe95bcc7 | fix text_encoder_lr to work with int, closes #1608 | 2024-09-17 21:21:28 +09:00
Kohya S | d8d15f1a7e | add support for specifying blocks in FLUX.1 LoRA training | 2024-09-16 23:14:09 +09:00
Kohya S | c9ff4de905 | Add support for specifying rank for each layer in FLUX.1 | 2024-09-14 22:17:52 +09:00
Kohya S | 2d8ee3c280 | OFT for FLUX.1 | 2024-09-14 15:48:16 +09:00
Kohya S | cefe52629e | fix so the old notation for TE LR in .toml still works | 2024-09-12 12:36:07 +09:00
Kohya S | d10ff62a78 | support individual LR for CLIP-L/T5XXL | 2024-09-10 20:32:09 +09:00
Kohya S | 56cb2fc885 | support T5XXL LoRA, reduce peak memory usage #1560 | 2024-09-04 23:15:27 +09:00
Kohya S | b65ae9b439 | T5XXL LoRA training, fp8 T5XXL support | 2024-09-04 21:33:17 +09:00
Kohya S | 0087a46e14 | FLUX.1 LoRA supports CLIP-L | 2024-08-27 19:59:40 +09:00
Kohya S | 5639c2adc0 | fix typo | 2024-08-24 16:37:49 +09:00
Kohya S | cf689e7aa6 | feat: Add option to split projection layers and apply LoRA | 2024-08-24 16:35:43 +09:00
Kohya S | 56d7651f08 | add experimental split mode for FLUX | 2024-08-13 22:28:39 +09:00
Kohya S | 8a0f12dde8 | update FLUX LoRA training | 2024-08-10 23:42:05 +09:00
Kohya S | 358f13f2c9 | fix alpha being ignored | 2024-08-10 14:03:59 +09:00
Kohya S | 36b2e6fc28 | add FLUX.1 LoRA training | 2024-08-09 22:56:48 +09:00