Kohya S
6d6d86260b
add Deep Shrink
2023-11-23 19:40:48 +09:00
rockerBOO
c856ea4249
Add attention processor
2023-11-19 12:11:36 -05:00
Kohya S
d0923d6710
add caption_separator option
2023-11-19 21:44:52 +09:00
Kohya S
f312522cef
Merge pull request #913 from KohakuBlueleaf/custom-seperator
Add custom separator for shuffle caption
2023-11-19 21:32:01 +09:00
xzuyn
da5a144589
Add PagedAdamW
2023-11-18 07:47:27 -05:00
Won-Kyu Park
2c1e669bd8
add min_diff, clamp_quantile args
based on https://github.com/bmaltais/kohya_ss/pull/1332 a9ec90c40a
2023-11-10 02:35:55 +09:00
Won-Kyu Park
e20e9f61ac
use **kwargs and change svd() calling convention to make svd() reusable
* add required attributes to model_org, model_tuned, save_to
* set "*_alpha" using str(float(foo))
2023-11-10 02:35:10 +09:00
feffy380
6b3148fd3f
Fix min-snr-gamma for v-prediction and ZSNR.
This fixes min-snr for vpred+zsnr by dividing directly by SNR+1.
The old implementation did it in two steps: (min-snr/snr) * (snr/(snr+1)), which causes division by zero when combined with --zero_terminal_snr
2023-11-07 23:02:25 +01:00
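The fix described in the commit body above can be sketched as follows. This is a hypothetical helper (the name `min_snr_weight_vpred` and the standalone form are assumptions, not the actual train_util.py code), but the arithmetic follows the commit message: divide min-SNR directly by SNR+1 instead of computing it in two steps.

```python
import torch

def min_snr_weight_vpred(snr: torch.Tensor, gamma: float) -> torch.Tensor:
    # Old, broken form: (min(snr, gamma) / snr) * (snr / (snr + 1)).
    # With --zero_terminal_snr the terminal timestep has snr == 0,
    # so the first factor divides by zero.
    # Fixed form per the commit body: divide min-SNR directly by snr + 1.
    return torch.minimum(snr, torch.full_like(snr, gamma)) / (snr + 1)
```

At the terminal timestep (snr = 0) the weight is simply 0 instead of NaN, which is why the one-step division matters.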
rockerBOO
569ca72fc4
Set grad enabled if is_train and train_text_encoder
We only want to enable grad if we are training.
2023-11-07 11:59:30 -05:00
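The intent of the change above can be sketched with `torch.set_grad_enabled`. This is a minimal illustration, not the actual train_network.py code: the tiny `nn.Linear` stands in for the text encoder, and the flag names mirror the commit title.

```python
import torch
import torch.nn as nn

def encode(text_encoder: nn.Module, ids: torch.Tensor,
           is_train: bool, train_text_encoder: bool) -> torch.Tensor:
    # Enable grad only when this is a training pass AND the text
    # encoder itself is being trained; otherwise run without grad.
    with torch.set_grad_enabled(is_train and train_text_encoder):
        return text_encoder(ids)
```

Combining both flags into the context manager avoids building an unused autograd graph during validation or when the text encoder is frozen.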
rockerBOO
9c591bdb12
Remove unnecessary subset line from collate
2023-11-05 16:58:20 -05:00
rockerBOO
e545fdfd9a
Removed/cleaned up a line
2023-11-05 16:56:36 -05:00
rockerBOO
c89252101e
Add process_batch for train_network
2023-11-05 16:27:36 -05:00
rockerBOO
a93c524b3a
Update args to validation_seed and validation_split
2023-11-05 12:37:44 -05:00
rockerBOO
3de9e6c443
Add validation split of datasets
2023-11-05 12:37:44 -05:00
rockerBOO
33c311ed19
new ratio code
2023-11-05 12:37:37 -05:00
rockerBOO
5b19bda85c
Add validation loss
2023-11-05 12:35:46 -05:00
Kohya S
95ae56bd22
Update README.md
2023-11-05 21:10:26 +09:00
Kohya S
990192d077
Merge pull request #927 from kohya-ss/dev
Dev
2023-11-05 19:31:41 +09:00
Kohya S
f3e69531c3
update readme
2023-11-05 19:30:52 +09:00
Kohya S
0cb3272bda
update readme
2023-11-05 19:26:35 +09:00
Kohya S
6231aa91e2
common LR logging, set ddp_timeout default to None
2023-11-05 19:09:17 +09:00
Kohaku-Blueleaf
489b728dbc
Fix typo again
2023-10-30 20:19:51 +08:00
Kohaku-Blueleaf
583e2b2d01
Fix typo
2023-10-30 20:02:04 +08:00
Kohaku-Blueleaf
5dc2a0d3fd
Add custom separator
2023-10-30 19:55:30 +08:00
Yuta Hayashibe
2c731418ad
Added sample_images() for --sample_at_first
2023-10-29 22:08:42 +09:00
Yuta Hayashibe
5c150675bf
Added --sample_at_first description
2023-10-29 21:46:47 +09:00
Yuta Hayashibe
fea810b437
Added --sample_at_first to generate sample images before training
2023-10-29 21:44:57 +09:00
Kohya S
96d877be90
support separate LR for Text Encoder for SD1/2
2023-10-29 21:30:32 +09:00
Yuta Hayashibe
40d917b0fe
Removed incorrect comments
2023-10-29 21:02:44 +09:00
Kohya S
e72020ae01
update readme
2023-10-29 20:52:43 +09:00
Kohya S
01d929ee2a
support separate learning rates for TE1/2
2023-10-29 20:38:01 +09:00
Yuta Hayashibe
cf876fcdb4
Accept --ss to set sample_sampler dynamically
2023-10-29 20:15:04 +09:00
Yuta Hayashibe
291c29caaf
Added a function line_to_prompt_dict() and removed duplicated initializations
2023-10-29 19:57:25 +09:00
Yuta Hayashibe
01e00ac1b0
Make a function get_my_scheduler()
2023-10-29 19:46:02 +09:00
Kohya S
a9ed4ed8a8
Merge pull request #900 from xzuyn/paged_adamw_32bit
Add PagedAdamW32bit
2023-10-29 15:01:55 +09:00
Kohya S
9d6a5a0c79
Merge pull request #899 from shirayu/use_moving_average
Show moving average loss in the progress bar
2023-10-29 14:37:58 +09:00
Kohya S
fb97a7aab1
Merge pull request #898 from shirayu/update_repare_buckets_latents
Fix a typo and add assertions in making buckets
2023-10-29 14:29:53 +09:00
Kohaku-Blueleaf
1cefb2a753
Better implementation for te autocast (#895)
* Better implementation for te
* Fix some misunderstanding
* same as unet, add explicit convert
* Better cache TE and TE lr
* Fix with list
* Add timeout settings
* Fix arg style
2023-10-28 15:49:59 +09:00
Yuta Hayashibe
63992b81c8
Fix initialization place of loss_recorder
2023-10-27 21:13:29 +09:00
xzuyn
d8f68674fb
Update train_util.py
2023-10-27 07:05:53 -04:00
Yuta Hayashibe
9d00c8eea2
Use LossRecorder
2023-10-27 18:31:36 +09:00
Yuta Hayashibe
0d21925bdf
Use @property
2023-10-27 18:14:27 +09:00
Yuta Hayashibe
efef5c8ead
Show "avr_loss" instead of "loss" because it is moving average
2023-10-27 17:59:58 +09:00
Yuta Hayashibe
3d2bb1a8f1
Add LossRecorder and use moving average in all places
2023-10-27 17:49:49 +09:00
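A minimal sketch of what a loss recorder like the one above might look like (hypothetical; the actual class lives in train_util.py and may differ): it accumulates per-step losses and exposes the running mean as a `@property`, which is what the progress bar reports as "avr_loss" instead of the raw per-step "loss".

```python
class LossRecorder:
    """Records per-step losses and reports their running (moving) average."""

    def __init__(self) -> None:
        self.loss_list: list[float] = []
        self.loss_total: float = 0.0

    def add(self, loss: float) -> None:
        self.loss_list.append(loss)
        self.loss_total += loss

    @property
    def moving_average(self) -> float:
        # Mean of every loss recorded so far; far less noisy than the
        # raw per-step value, hence the "avr_loss" label.
        return self.loss_total / len(self.loss_list)
```

Keeping a running total alongside the list makes the average O(1) per query instead of re-summing the list each step.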
Yuta Hayashibe
837a4dddb8
Added assertions
2023-10-26 13:34:36 +09:00
Yuta Hayashibe
b2626bc7a9
Fix a typo
2023-10-26 00:51:17 +09:00
青龍聖者@bdsqlsz
202f2c3292
Debias Estimation loss ( #889 )
* update for bnb 0.41.1
* fixed generate_controlnet_subsets_config for training
* Revert "update for bnb 0.41.1"
This reverts commit 70bd3612d8.
* add debiased_estimation_loss
* add train_network
* Revert "add train_network"
This reverts commit 6539363c5c.
* Update train_network.py
2023-10-23 22:59:14 +09:00
Kohya S
2a23713f71
Merge pull request #872 from kohya-ss/dev
fix make_captions_by_git, improve image generation scripts
v0.7.0
2023-10-11 07:56:39 +09:00
Kohya S
681034d001
update readme
2023-10-11 07:54:30 +09:00
Kohya S
17813ff5b4
remove workaround for transformers bs>1 close #869
2023-10-11 07:40:12 +09:00