Kohya S
e8cfd4ba1d
fix cond mask and alpha mask to work together
2024-05-26 22:01:37 +09:00
Kohya S
da6fea3d97
simplify and update alpha mask to work with various cases
2024-05-19 21:26:18 +09:00
Kohya S
f2dd43e198
revert kwargs to explicit declaration
2024-05-19 19:23:59 +09:00
u-haru
db6752901f
Add option to use the image's alpha channel as a loss mask (#1223)
...
* Add alpha_mask parameter and apply masked loss
* Fix type hint in trim_and_resize_if_required function
* Refactor code to use keyword arguments in train_util.py
* Fix alpha mask flipping logic
* Fix alpha mask initialization
* Fix alpha_mask transformation
* Cache alpha_mask
* Update alpha_masks to be on CPU
* Set flipped_alpha_masks to None if option disabled
* Check if alpha_mask is None
* Set alpha_mask to None if option disabled
* Add description of alpha_mask option to docs
2024-05-19 19:07:25 +09:00
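The alpha_mask commits above use an image's alpha channel as a per-pixel loss weight. A minimal, torch-free sketch of the idea, where the function name and flat-list layout are illustrative assumptions rather than the repository's actual code:

```python
def masked_mse_loss(pred, target, alpha_mask=None):
    """MSE where each pixel's contribution is weighted by its alpha value.

    pred, target: flat lists of floats (one entry per pixel)
    alpha_mask:   flat list of weights in [0.0, 1.0], or None for no masking
    """
    sq_err = [(p - t) ** 2 for p, t in zip(pred, target)]
    if alpha_mask is not None:
        # fully transparent pixels (alpha 0) contribute nothing to the loss
        sq_err = [e * a for e, a in zip(sq_err, alpha_mask)]
    return sum(sq_err) / len(sq_err)
```

In the actual training code the same weighting would be applied to latent-space tensors, with the mask resized and flipped alongside the image augmentations (the subject of the fix commits above).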
Kohya S
febc5c59fa
update README
2024-05-19 19:03:43 +09:00
Kohya S
4c798129b0
update README
2024-05-19 19:00:32 +09:00
Kohya S
38e4c602b1
Merge pull request #1277 from Cauldrath/negative_learning
...
Allow negative learning rate
2024-05-19 18:55:50 +09:00
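The negative-learning-rate change above rests on a simple property of gradient descent: flipping the sign of the learning rate turns a descent step into an ascent step, pushing parameters away from what the data would teach. An illustrative one-liner, not the PR's code:

```python
def sgd_step(param, grad, lr):
    # standard update: moves against the gradient when lr > 0,
    # *with* the gradient (ascent, "unlearning") when lr < 0
    return param - lr * grad
```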
Kohya S
e4d9e3c843
remove dependency on omegaconf (ref #1284)
2024-05-19 17:46:07 +09:00
Kohya S
de0e0b9468
Merge pull request #1284 from sdbds/fix_traincontrolnet
...
Fix train controlnet
2024-05-19 17:39:15 +09:00
Kohya S
c68baae480
add --log_config option to enable/disable output of the training config
2024-05-19 17:21:04 +09:00
Kohya S
47187f7079
Merge pull request #1285 from ccharest93/main
...
Hyperparameter tracking
2024-05-19 16:31:33 +09:00
Kohya S
e3ddd1fbbe
update README and format code
2024-05-19 16:26:10 +09:00
Kohya S
0640f017ab
Merge pull request #1322 from aria1th/patch-1
...
Accelerate: fix get_trainable_params in controlnet-llite training
2024-05-19 16:23:01 +09:00
Kohya S
2f19175dfe
update README
2024-05-19 15:38:37 +09:00
Kohya S
146edce693
support Diffusers' based SDXL LoRA key for inference
2024-05-18 11:05:04 +09:00
Kohya S
153764a687
add prompt option '--f' for filename
2024-05-15 20:21:49 +09:00
Kohya S
589c2aa025
update README
2024-05-13 21:20:37 +09:00
Kohya S
16677da0d9
fix create_network_from_weights not working
2024-05-12 22:15:07 +09:00
Kohya S
a384bf2187
Merge pull request #1313 from rockerBOO/patch-3
...
Add caption_separator to output for subset
2024-05-12 21:36:56 +09:00
Kohya S
1c296f7229
Merge pull request #1312 from rockerBOO/patch-2
...
Fix caption_separator missing in subset schema
2024-05-12 21:33:12 +09:00
Kohya S
e96a5217c3
Merge pull request #1291 from frodo821/patch-1
...
removed unnecessary `torch` import on line 115
2024-05-12 21:14:50 +09:00
Kohya S
39b82f26e5
update readme
2024-05-12 20:58:45 +09:00
Kohya S
3701507874
raise original error if an error occurs while checking latents
2024-05-12 20:56:56 +09:00
Kohya S
78020936d2
Merge pull request #1278 from Cauldrath/catch_latent_error_file
...
Display name of error latent file
2024-05-12 20:46:25 +09:00
Kohya S
9ddb4d7a01
update readme and help message etc.
2024-05-12 17:55:08 +09:00
Kohya S
8d1b1acd33
Merge pull request #1266 from Zovjsra/feature/disable-mmap
...
Add "--disable_mmap_load_safetensors" parameter
2024-05-12 17:43:44 +09:00
Kohya S
02298e3c4a
Merge pull request #1331 from kohya-ss/lora-plus
...
Lora plus
2024-05-12 17:04:58 +09:00
Kohya S
44190416c6
update docs etc.
2024-05-12 17:01:20 +09:00
Kohya S
3c8193f642
revert lora+ for lora_fa
2024-05-12 17:00:51 +09:00
Kohya S
c6a437054a
Merge branch 'dev' into lora-plus
2024-05-12 16:18:57 +09:00
Kohya S
1ffc0b330a
fix typo
2024-05-12 16:18:43 +09:00
Kohya S
e01e148705
Merge branch 'dev' into lora-plus
2024-05-12 16:17:52 +09:00
Kohya S
e9f3a622f4
Merge branch 'dev' into lora-plus
2024-05-12 16:17:27 +09:00
Kohya S
7983d3db5f
Merge pull request #1319 from kohya-ss/fused-backward-pass
...
Fused backward pass
2024-05-12 15:09:39 +09:00
Kohya S
bee8cee7e8
update README for fused optimizer
2024-05-12 15:08:52 +09:00
Kohya S
f3d2cf22ff
update README for fused optimizer
2024-05-12 15:03:02 +09:00
Kohya S
6dbc23cf63
Merge branch 'dev' into fused-backward-pass
2024-05-12 14:21:56 +09:00
Kohya S
c1ba0b4356
update readme
2024-05-12 14:21:10 +09:00
Kohya S
607e041f3d
chore: Refactor optimizer group
2024-05-12 14:16:41 +09:00
AngelBottomless
793aeb94da
fix get_trainable_params in controlnet-llite training
2024-05-07 18:21:31 +09:00
Kohya S
b56d5f7801
add experimental option to fuse params to optimizer groups
2024-05-06 21:35:39 +09:00
Kohya S
017b82ebe3
update help message for fused_backward_pass
2024-05-06 15:05:42 +09:00
Kohya S
2a359e0a41
Merge pull request #1259 from 2kpr/fused_backward_pass
...
Adafactor fused backward pass and optimizer step; lowers SDXL VRAM usage (at 1024 resolution) to 10 GB (BF16) / 16.4 GB (FP32)
2024-05-06 15:01:56 +09:00
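The fused backward pass merges the optimizer step into gradient computation: each parameter is updated as soon as its gradient arrives, so the full set of gradients never has to be held in memory at once, which is where the VRAM saving comes from. A torch-free sketch of the control flow, illustrative only and not the PR's Adafactor implementation:

```python
def fused_backward_sgd(params, grads, lr):
    """Update each parameter the moment its gradient is produced.

    params, grads: lists of floats standing in for tensors; in a real
    training loop the per-parameter step would run inside a gradient hook
    and the gradient would be freed immediately after use.
    """
    for i, g in enumerate(grads):
        params[i] -= lr * g   # step for this parameter only
        grads[i] = None       # gradient can be freed right away
    return params
```

The trade-off is that a fused step sees one gradient at a time, so techniques that need all gradients simultaneously (e.g. global gradient-norm clipping) do not combine with it directly.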
Kohya S
3fd8cdc55d
fix dylora loraplus
2024-05-06 14:03:19 +09:00
Kohya S
7fe81502d0
update loraplus on dylora/lora_fa
2024-05-06 11:09:32 +09:00
Kohya S
52e64c69cf
add debug log
2024-05-04 18:43:52 +09:00
Kohya S
58c2d856ae
support block dim/lr for sdxl
2024-05-03 22:18:20 +09:00
Dave Lage
8db0cadcee
Add caption_separator to output for subset
2024-05-02 18:08:28 -04:00
Dave Lage
dbb7bb288e
Fix caption_separator missing in subset schema
2024-05-02 17:39:35 -04:00
Kohya S
969f82ab47
move loraplus args from args to network_args, simplify log lr desc
2024-04-29 20:04:25 +09:00
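LoRA+ trains the LoRA "up" matrices with a learning rate scaled by a ratio relative to the "down" matrices, which the commit above exposes through network_args rather than top-level flags. A hedged sketch of how such a ratio might translate into optimizer parameter groups; the `lora_up`/`lora_down` name matching and the function itself are illustrative assumptions, not the repository's code:

```python
def loraplus_param_groups(named_params, base_lr, lr_ratio=16.0):
    """Split parameters into two optimizer groups: LoRA 'down' weights at
    the base learning rate, LoRA 'up' weights at base_lr * lr_ratio."""
    down_group = {"params": [], "lr": base_lr}
    up_group = {"params": [], "lr": base_lr * lr_ratio}
    for name, param in named_params:
        target = up_group if "lora_up" in name else down_group
        target["params"].append(param)
    return [down_group, up_group]
```

The returned list has the shape that optimizers accepting per-group options expect, so the ratio becomes a property of the parameter grouping rather than of the optimizer itself.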