mgz-dev
b29c5a750c
expand optimizer options and refactor
...
Refactor code to make it easier to add new optimizers, and support alternate optimizer parameters
- move redundant optimizer-initialization code to train_util
- add SGD Nesterov optimizer as an option (since it is already available)
- add new parameters that may be helpful for tuning existing and new optimizers
2023-02-19 17:45:09 -06:00
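The refactor above centralizes optimizer construction behind a single name-to-factory lookup, so adding an optimizer becomes a one-entry change. A minimal dependency-free sketch of that pattern follows; the function name `get_optimizer` and the dict-returning stand-in factories are illustrative assumptions, not the real train_util API (which builds `torch.optim` classes).

```python
# Sketch of a name -> factory dispatch for optimizers. The factories here
# return plain dicts as stand-ins; real code would construct torch.optim
# objects from `params`. All names are hypothetical.
def make_sgd(params, lr, **kw):
    # stand-in: ignores `params`, just records the configuration
    return {"type": "SGD", "lr": lr, **kw}

def make_sgd_nesterov(params, lr, momentum=0.9, **kw):
    # Nesterov momentum requires a nonzero momentum value in torch.optim.SGD
    return {"type": "SGD", "lr": lr, "momentum": momentum, "nesterov": True, **kw}

OPTIMIZERS = {
    "sgd": make_sgd,
    "sgdnesterov": make_sgd_nesterov,
}

def get_optimizer(name, params, lr, **kwargs):
    # one lookup instead of per-script if/elif chains
    try:
        factory = OPTIMIZERS[name.lower()]
    except KeyError:
        raise ValueError(f"unknown optimizer: {name}")
    return factory(params, lr, **kwargs)
```

With this shape, supporting a new optimizer (e.g. Lion, added in the next commit) is a single new entry in the table.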
Kohya S
048e7cd428
add lion optimizer support
2023-02-19 15:26:14 +09:00
Kohya S
ffdfd5f615
fix name of loss for epoch
2023-02-16 22:21:36 +09:00
Kohya S
914d1505df
Merge pull request #189 from shirayu/improve_loss_track
...
Show the moving average loss
2023-02-16 22:00:26 +09:00
Kohya S
43c0a69843
Add noise_offset
2023-02-14 21:15:48 +09:00
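The noise_offset technique adds one shared Gaussian offset per sample on top of the per-element noise, which helps the model learn very dark and very bright images. A rough dependency-free sketch under that assumption (the real code operates on latent tensors with torch; the function name and nested-list representation here are illustrative):

```python
import random

def apply_noise_offset(noise, noise_offset):
    # noise: per-sample noise values (nested lists stand in for tensors).
    # For each sample, draw ONE Gaussian scalar, scale it by noise_offset,
    # and add it to every element of that sample (a broadcast in tensor code).
    out = []
    for sample in noise:
        offset = noise_offset * random.gauss(0.0, 1.0)
        out.append([x + offset for x in sample])
    return out
```

The key point is that the offset is constant within a sample but differs across samples, shifting each image's overall noise level.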
Yuta Hayashibe
8aed5125de
Removed call of sum()
2023-02-14 21:11:30 +09:00
Yuta Hayashibe
21f5b618c3
Show the moving average loss
2023-02-14 19:46:27 +09:00
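Showing a moving average smooths the noisy per-step loss into a readable trend. A minimal sketch of such a tracker, using a fixed-size window; the class and parameter names are illustrative, not the actual implementation from this commit:

```python
from collections import deque

class LossRecorder:
    """Tracks a moving average of the training loss over the last `window` steps."""

    def __init__(self, window=100):
        # deque with maxlen drops the oldest loss automatically once full
        self.losses = deque(maxlen=window)

    def add(self, loss):
        self.losses.append(loss)

    @property
    def moving_average(self):
        return sum(self.losses) / len(self.losses)
```

Each step appends the current loss and logs `moving_average` instead of the raw value.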
Kohya S
3a72e6f003
add tag dropout
2023-02-09 21:35:27 +09:00
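Tag dropout randomly removes comma-separated tags from a caption during training, which can reduce overfitting to specific tags. A small sketch of the idea; the function name and the `keep_first` knob are illustrative assumptions (the actual scripts expose related options such as keep_tokens):

```python
import random

def dropout_tags(caption, dropout_rate, keep_first=True):
    # Split the caption into tags, drop each with probability dropout_rate,
    # optionally always keeping the first tag (e.g. the main subject).
    tags = [t.strip() for t in caption.split(",")]
    kept = []
    for i, tag in enumerate(tags):
        if keep_first and i == 0:
            kept.append(tag)
        elif random.random() >= dropout_rate:
            kept.append(tag)
    return ", ".join(kept)
```

With `dropout_rate=0.0` the caption passes through unchanged; with `1.0` only the protected first tag survives.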
Kohya S
e42b2f7aa9
conditional caption dropout (in progress)
2023-02-07 22:28:56 +09:00
forestsource
7db98baa86
Add dropout options
2023-02-07 00:01:30 +09:00
Kohya S
ea2dfd09ef
update bucketing features
2023-02-05 21:37:46 +09:00
hitomi
26a81d075c
add --persistent_data_loader_workers option
2023-02-01 16:02:15 +08:00
forestsource
5e817e4343
Add save_n_epoch_ratio
2023-01-22 03:00:28 +09:00
Kohya S
687044519b
Fix TE training stopping at max steps if epochs set
2023-01-19 21:43:34 +09:00
Kohya S
aa40cb9345
Add train epochs and max workers option to train
2023-01-15 13:07:47 +09:00
Kohya S
82e585cf01
Fix full_fp16 with clip_skip==2 not working
2023-01-08 18:49:34 +09:00
Kohya S
80af4c0c42
Set dtype if text encoder is not trained at all
2023-01-07 21:43:27 +09:00
Kohya S
2efced0a9a
fix training starts with debug_dataset
2023-01-07 20:19:25 +09:00
Kohya S
1b222dbf9b
remove use of deleted property
2023-01-06 17:13:56 +09:00
Kohya S
f56988b252
unify dataset and save functions
2023-01-05 08:10:22 +09:00
Kohya S
4c35006731
split common function from train_network to util
2023-01-03 20:22:25 +09:00
Kohya S
e31177adf3
Merge branch 'refactoring_training' of https://github.com/kohya-ss/sd-scripts into refactoring_training
2023-01-02 16:14:45 +09:00
Kohya S
6b522b34c1
move code for xformers to train_util
2023-01-02 16:08:21 +09:00
Yuta Hayashibe
85d8b49129
Fix calculation for the old epoch
2023-01-01 23:36:20 +09:00
Yuta Hayashibe
61a61c51ee
Add --save_last_n_epochs option
2023-01-01 21:46:38 +09:00
Yuta Hayashibe
0f20453997
Fix typos
2022-12-31 14:39:36 +09:00
Kohya S
0a884da984
Fix text encoder training not being stopped #4
2022-12-22 23:34:51 +09:00
bmaltais
c5cca931ab
Proposed file structure rework and required file changes
2022-12-18 15:24:46 -05:00