Commit Graph

70 Commits

Author SHA1 Message Date
Kohya S
9ab964d0b8 Add Adafactor optimizer 2023-02-22 21:09:47 +09:00
Kohya S
663aad2b0d refactor get_scheduler etc. 2023-02-20 22:47:43 +09:00
mgz-dev
b29c5a750c expand optimizer options and refactor
Refactor code to make it easier to add new optimizers, and to support alternate optimizer parameters:

- move redundant code for initializing optimizers to train_util
- add SGD Nesterov optimizers as an option (since they are already available)
- add new parameters which may be helpful for tuning existing and new optimizers
2023-02-19 17:45:09 -06:00
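The refactor above centralizes optimizer construction in train_util so that adding a new optimizer is a single registry entry. A minimal sketch of that dispatch pattern, assuming hypothetical names (`build_optimizer_spec`, `OptimizerSpec`, and the per-optimizer defaults are all illustrative, not the repository's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class OptimizerSpec:
    cls_name: str                                # optimizer class to instantiate later
    kwargs: dict = field(default_factory=dict)   # constructor arguments

def build_optimizer_spec(name: str, lr: float, **extra) -> OptimizerSpec:
    """Map an optimizer-type string to a constructor spec (illustrative only)."""
    registry = {
        "adamw":        ("AdamW",     {}),
        "lion":         ("Lion",      {}),
        "adafactor":    ("Adafactor", {"relative_step": True}),
        "sgd_nesterov": ("SGD",       {"momentum": 0.9, "nesterov": True}),
    }
    key = name.lower()
    if key not in registry:
        raise ValueError(f"unknown optimizer: {name}")
    cls_name, defaults = registry[key]
    # caller-supplied kwargs override the per-optimizer defaults
    return OptimizerSpec(cls_name, {"lr": lr, **defaults, **extra})
```

With this shape, the SGD Nesterov and "new parameters" bullets above both reduce to editing one table entry or passing extra keyword arguments through.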
Kohya S
048e7cd428 add lion optimizer support 2023-02-19 15:26:14 +09:00
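Lion (from "Symbolic Discovery of Optimization Algorithms") steps in the sign direction of an interpolated momentum, with decoupled weight decay. A dependency-free single-list sketch of the published update rule, not the repository's implementation:

```python
def lion_step(params, grads, momentum, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update over flat lists of floats (illustrative sketch)."""
    for i, (p, g, m) in enumerate(zip(params, grads, momentum)):
        c = beta1 * m + (1 - beta1) * g              # interpolated update direction
        sign = (c > 0) - (c < 0)                     # sign(c)
        params[i] = p - lr * (sign + wd * p)         # signed step + decoupled weight decay
        momentum[i] = beta2 * m + (1 - beta2) * g    # EMA momentum update
    return params, momentum
```

Because the step magnitude is just lr * sign(c), Lion typically wants a smaller learning rate than AdamW for the same task.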
Kohya S
43c0a69843 Add noise_offset 2023-02-14 21:15:48 +09:00
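noise_offset follows the "offset noise" trick: add a small per-sample constant shift on top of the usual Gaussian noise (noise = randn(B, C, H, W) + noise_offset * randn(B, C, 1, 1)), which helps the model learn overall brightness. A sketch with shapes flattened to one value per element; the helper name is hypothetical:

```python
import random

def offset_noise(n_samples: int, n_elems: int, offset: float,
                 rng: random.Random) -> list:
    """Gaussian noise plus one shared constant shift per sample (sketch)."""
    out = []
    for _ in range(n_samples):
        base = [rng.gauss(0.0, 1.0) for _ in range(n_elems)]
        shift = offset * rng.gauss(0.0, 1.0)   # one shift, shared by the sample
        out.append([b + shift for b in base])
    return out
```

The shift moves every element of a sample by the same amount, so within-sample differences are unchanged while the sample mean is perturbed.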
Kohya S
3a72e6f003 add tag dropout 2023-02-09 21:35:27 +09:00
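Tag dropout randomly removes individual comma-separated tags from a caption during training so the model does not overfit to full tag lists. A hedged sketch of the idea (the function name and signature are illustrative):

```python
import random

def drop_tags(caption: str, p: float, rng=random) -> str:
    """Drop each comma-separated tag with probability p (illustrative sketch)."""
    tags = [t.strip() for t in caption.split(",") if t.strip()]
    kept = [t for t in tags if rng.random() >= p]
    return ", ".join(kept)
```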
Kohya S
e42b2f7aa9 conditional caption dropout (in progress) 2023-02-07 22:28:56 +09:00
forestsource
7db98baa86 Add dropout options 2023-02-07 00:01:30 +09:00
Kohya S
ea2dfd09ef update bucketing features 2023-02-05 21:37:46 +09:00
hitomi
26a81d075c add --persistent_data_loader_workers option 2023-02-01 16:02:15 +08:00
forestsource
5e817e4343 Add save_n_epoch_ratio 2023-01-22 03:00:28 +09:00
Kohya S
aa40cb9345 Add train epochs and max workers option to train 2023-01-15 13:07:47 +09:00
Kohya S
6b62c44022 fix errors in fine tuning 2023-01-08 21:40:40 +09:00
Kohya S
82e585cf01 Fix full_fp16 with clip_skip==2 not working 2023-01-08 18:49:34 +09:00
Kohya S
f56988b252 unify dataset and save functions 2023-01-05 08:10:22 +09:00
Kohya S
4c35006731 split common function from train_network to util 2023-01-03 20:22:25 +09:00
Kohya S
6b522b34c1 move code for xformers to train_util 2023-01-02 16:08:21 +09:00
Yuta Hayashibe
0f20453997 Fix typos 2022-12-31 14:39:36 +09:00
bmaltais
c5cca931ab Proposed file structure rework and required file changes 2022-12-18 15:24:46 -05:00
Kohya S
7a04196e66 Add scripts. 2022-12-18 14:55:34 +09:00