woctordho
1cd95b2d8b
Add skip_image_resolution to deduplicate multi-resolution dataset (#2273)
...
* Add min_orig_resolution and max_orig_resolution
* Rename min_orig_resolution to skip_image_resolution; remove max_orig_resolution
* Change skip_image_resolution to tuple
* Move filtering to __init__
* Minor fix
2026-03-19 08:43:39 +09:00
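The commit above describes a `(min, max)` resolution tuple whose filtering happens in `__init__`. A minimal sketch of one plausible reading, assuming the tuple bounds a pixel-count range to skip; only the name `skip_image_resolution` comes from the commit messages, everything else is hypothetical:

```python
# Hypothetical sketch: drop images whose pixel count falls inside the
# skip range, to deduplicate a multi-resolution dataset at init time.
# The real implementation may use different semantics.

def filter_by_resolution(image_sizes, skip_image_resolution=None):
    """Keep only (width, height) pairs whose pixel count is outside the skip range."""
    if skip_image_resolution is None:
        return list(image_sizes)
    lo, hi = skip_image_resolution
    kept = []
    for width, height in image_sizes:
        pixels = width * height
        if lo <= pixels <= hi:
            continue  # inside the skip range: drop this duplicate resolution
        kept.append((width, height))
    return kept
```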
rockerBOO
58e9e146a3
Add resize interpolation configuration
2025-02-19 14:20:40 -05:00
rockerBOO
264167fa16
Apply is_training_dataset only to DreamBoothDataset. Add validation_split check and warning
2025-01-09 12:43:58 -05:00
rockerBOO
9fde0d7972
Handle tuple return from generate_dataset_group_by_blueprint
2025-01-08 18:38:20 -05:00
rockerBOO
556f3f1696
Fix documentation, remove unused function, fix bucket reso for sd1.5, fix multiple datasets
2025-01-08 13:41:15 -05:00
rockerBOO
0522070d19
Fix training, validation split, revert to using upstream implementation
2025-01-03 15:20:25 -05:00
rockerBOO
c8c3569df2
Cleanup order, types, print to logger
2025-01-03 01:26:45 -05:00
rockerBOO
534059dea5
Typos and lingering is_train
2025-01-03 01:18:15 -05:00
rockerBOO
d23c7322ee
Merge remote-tracking branch 'hina/feature/val-loss' into validation-loss-upstream
...
Modified implementation for process_batch and cleaned up validation recording
2025-01-03 00:48:08 -05:00
rockerBOO
7f6e124c7c
Merge branch 'gesen2egee/val' into validation-loss-upstream
...
Modified various implementations to restore original behavior
2025-01-02 23:04:38 -05:00
rockerBOO
449c1c5c50
Adding modified train_util and config_util
2025-01-02 15:59:20 -05:00
gesen2egee
8743532963
val
2025-01-02 15:57:12 -05:00
Hina Chen
05bb9183fa
Add Validation loss for LoRA training
2024-12-27 16:47:59 +08:00
Kohya S
5e86323f12
Update README and clean-up the code for SD3 timesteps
2024-11-07 21:27:12 +09:00
Kohya S
3cc5b8db99
Diff Output Preservation loss for SDXL
2024-10-18 20:57:13 +09:00
Kohya S
41dee60383
Refactor caching mechanism for latents and text encoder outputs, etc.
2024-07-27 13:50:05 +09:00
Kohya S
e8cfd4ba1d
Fix so cond mask and alpha mask work
2024-05-26 22:01:37 +09:00
Kohya S
da6fea3d97
simplify and update alpha mask to work with various cases
2024-05-19 21:26:18 +09:00
u-haru
db6752901f
Add option to use the image's alpha channel as a loss mask (#1223)
...
* Add alpha_mask parameter and apply masked loss
* Fix type hint in trim_and_resize_if_required function
* Refactor code to use keyword arguments in train_util.py
* Fix alpha mask flipping logic
* Fix alpha mask initialization
* Fix alpha_mask transformation
* Cache alpha_mask
* Update alpha_masks to be on CPU
* Set flipped_alpha_masks to None if option disabled
* Check if alpha_mask is None
* Set alpha_mask to None if option disabled
* Add description of alpha_mask option to docs
2024-05-19 19:07:25 +09:00
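The bullets above describe weighting the per-pixel loss by the image's alpha channel, with the mask set to None when the option is disabled. A minimal NumPy sketch of that idea; function and variable names are illustrative, not the repository's actual API:

```python
import numpy as np

# Sketch of alpha-channel loss masking. If alpha_mask is None (option
# disabled), the loss is returned unchanged, mirroring the
# "Set alpha_mask to None if option disabled" bullet above.

def apply_alpha_mask(per_pixel_loss, alpha_mask=None):
    """Weight a per-pixel loss map by an alpha mask with values in [0, 1]."""
    if alpha_mask is None:
        return per_pixel_loss
    return per_pixel_loss * alpha_mask

loss = np.ones((2, 2))
mask = np.array([[1.0, 0.0], [0.5, 1.0]])
masked = apply_alpha_mask(loss, mask)  # transparent pixels contribute no loss
```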
Kohya S
a384bf2187
Merge pull request #1313 from rockerBOO/patch-3
...
Add caption_separator to output for subset
2024-05-12 21:36:56 +09:00
Dave Lage
8db0cadcee
Add caption_separator to output for subset
2024-05-02 18:08:28 -04:00
Dave Lage
dbb7bb288e
Fix caption_separator missing in subset schema
2024-05-02 17:39:35 -04:00
gesen2egee
fde8026c2d
Update config_util.py
2024-04-11 11:29:26 +08:00
gesen2egee
459b12539b
Update config_util.py
2024-04-11 01:52:14 +08:00
gesen2egee
3b251b758d
Update config_util.py
2024-04-11 01:50:32 +08:00
gesen2egee
36d4023431
Update config_util.py
2024-04-11 01:39:17 +08:00
gesen2egee
086f6000f2
Merge branch 'main' into val
2024-04-11 01:14:46 +08:00
Kohya S
c86e356013
Merge branch 'dev' into dataset-cache
2024-03-26 19:43:40 +09:00
Kohya S
025347214d
refactor metadata caching for DreamBooth dataset
2024-03-24 18:09:32 +09:00
Kohaku-Blueleaf
ae97c8bfd1
[Experimental] Add cache mechanism for dataset groups to avoid long initialization wait times (#1178)
...
* support meta cached dataset
* add cache meta scripts
* random ip_noise_gamma strength
* random noise_offset strength
* use correct settings for parser
* cache path/caption/size only
* revert mess up commit
* revert mess up commit
* Update requirements.txt
* Add arguments for meta cache.
* remove pickle implementation
* Return sizes when enable cache
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2024-03-24 15:40:18 +09:00
Kohya S
3419c3de0d
common masked loss func, applied to all training scripts
2024-03-17 19:30:20 +09:00
gesen2egee
b5e8045df4
fix control net
2024-03-16 11:51:41 +08:00
gesen2egee
5d7ed0dff0
Merge remote-tracking branch 'kohya-ss/dev' into val
2024-03-13 18:00:49 +08:00
gesen2egee
7d84ac2177
only use train subset to val
2024-03-11 14:41:51 +08:00
gesen2egee
78cfb01922
improve
2024-03-10 18:55:48 +08:00
gesen2egee
b558a5b73d
val
2024-03-10 04:37:16 +08:00
Kohya S
f2c727fc8c
add minimal impl for masked loss
2024-02-26 23:19:58 +09:00
Kohya S
577e9913ca
add some new dataset settings
2024-02-26 20:01:25 +09:00
Yuta Hayashibe
5f6bf29e52
Replace print with logger where the output is a log message (#905)
...
* Add get_my_logger()
* Use logger instead of print
* Fix log level
* Removed line-breaks for readability
* Use setup_logging()
* Add rich to requirements.txt
* Make simple
* Use logger instead of print
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2024-02-04 18:14:34 +09:00
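The print→logger migration above mentions a `get_my_logger()` helper and a `setup_logging()` step. A stdlib-only sketch, assuming a conventional shape for such a helper (the actual code also wires up `rich`, per the bullets; the body here is a guess):

```python
import logging

# Sketch of the print -> logger migration described in the bullets above.
# The helper name get_my_logger appears in the commit; this implementation
# is an assumption, kept to the standard library only.

def get_my_logger(name=__name__, level=logging.INFO):
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid stacking duplicate handlers on repeated calls
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))
        logger.addHandler(handler)
    logger.setLevel(level)
    return logger

logger = get_my_logger("config_util")
logger.info("dataset group initialized")  # instead of print(...)
```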
Kohya S
fef172966f
Add network_multiplier for dataset and train LoRA
2024-01-20 16:24:43 +09:00
Kohya S
5a1ebc4c7c
format by black
2024-01-20 13:10:45 +09:00
Furqanil Taqwa
4a913ce61e
Initialize keep_tokens_separator in dataset config
2023-11-28 17:22:35 +07:00
rockerBOO
3de9e6c443
Add validation split of datasets
2023-11-05 12:37:44 -05:00
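The validation-split work above (and the later `validation_split` check and warning) implies carving a fraction of a dataset off for validation. A minimal sketch under the assumption of a fractional split with deterministic shuffling; names and seeding behavior are illustrative, not the repository's exact implementation:

```python
import random

# Illustrative fractional train/validation split, in the spirit of the
# "Add validation split of datasets" commit above. The parameter name
# validation_split matches the log; the logic here is an assumption.

def split_train_val(items, validation_split, seed=42):
    """Return (train, val) lists; validation_split is a fraction in [0, 1)."""
    items = list(items)
    rng = random.Random(seed)   # deterministic shuffle for reproducible splits
    rng.shuffle(items)
    n_val = int(len(items) * validation_split)
    return items[n_val:], items[:n_val]

train, val = split_train_val(range(10), 0.2)
```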
Kohaku-Blueleaf
489b728dbc
Fix typo again
2023-10-30 20:19:51 +08:00
Kohaku-Blueleaf
5dc2a0d3fd
Add custom separator
2023-10-30 19:55:30 +08:00
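The custom-separator commit above (and the later `caption_separator` commits) concern splitting captions into tags on a configurable delimiter. A tiny sketch, assuming comma as the default; the parameter name `caption_separator` appears in the log, the logic is a guess:

```python
# Hypothetical sketch of caption tag splitting with a configurable
# separator, in the spirit of the caption_separator commits in this log.

def split_caption(caption, caption_separator=","):
    """Split a caption into stripped, non-empty tags on the separator."""
    return [tag.strip() for tag in caption.split(caption_separator) if tag.strip()]
```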
青龍聖者@bdsqlsz
d5be8125b0
Update bitsandbytes to 0.41.1 and fix bugs in generate_controlnet_subsets_config for training (#823)
...
* update for bnb 0.41.1
* fixed generate_controlnet_subsets_config for training
* Revert "update for bnb 0.41.1"
This reverts commit 70bd3612d8.
2023-09-24 10:51:47 +09:00
Kohya S
948cf17499
add caption_prefix/suffix to dataset
2023-09-02 16:17:12 +09:00
Kohya S
ce46aa0c3b
remove debug print
2023-07-04 21:34:18 +09:00
Kohya S
71a6d49d06
Fix train_network to work with fine-tuning dataset
2023-06-28 07:50:53 +09:00
Kohya S
9e9df2b501
update dataset to return size, refactor ctrlnet ds
2023-06-24 17:56:02 +09:00