Kohya S
57c565c402
support sample generation in TI training
2023-02-28 22:05:10 +09:00
Kohya S
dd523c94ff
sample images in training (not fully tested)
2023-02-27 17:48:32 +09:00
Kohya S
9ab964d0b8
Add Adafactor optimizer
2023-02-22 21:09:47 +09:00
Kohya S
663aad2b0d
refactor get_scheduler etc.
2023-02-20 22:47:43 +09:00
mgz-dev
b29c5a750c
expand optimizer options and refactor
Refactor code to make it easier to add new optimizers, and support alternate optimizer parameters
- move redundant code to train_util for initializing optimizers
- add SGD Nesterov optimizers as option (since they are already available)
- add new parameters which may be helpful for tuning existing and new optimizers
2023-02-19 17:45:09 -06:00
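The refactor above centralizes optimizer construction so each training script stops duplicating setup code. A minimal pure-Python sketch of that dispatch pattern (the `get_optimizer` factory and the toy `SGD` class here are illustrative, not the repository's actual API):

```python
# Illustrative sketch of a centralized optimizer factory, mirroring the
# commit's idea of moving optimizer setup into one helper. Names are
# hypothetical; the real code constructs torch.optim optimizers.

class SGD:
    """Toy SGD with optional Nesterov momentum over a list of scalar params."""

    def __init__(self, params, lr=0.1, momentum=0.0, nesterov=False):
        self.params = params          # list of floats, mutated in place
        self.lr = lr
        self.momentum = momentum
        self.nesterov = nesterov
        self.velocity = [0.0] * len(params)

    def step(self, grads):
        for i, g in enumerate(grads):
            v = self.momentum * self.velocity[i] + g
            self.velocity[i] = v
            # Nesterov looks ahead along the velocity direction.
            d = g + self.momentum * v if self.nesterov else v
            self.params[i] -= self.lr * d


def get_optimizer(name, params, **kwargs):
    """Single entry point, so adding a new optimizer means one registry line."""
    registry = {
        "sgd": lambda: SGD(params, **kwargs),
        "sgd_nesterov": lambda: SGD(params, nesterov=True, **kwargs),
    }
    if name.lower() not in registry:
        raise ValueError(f"unknown optimizer: {name}")
    return registry[name.lower()]()
```

For example, `get_optimizer("sgd_nesterov", [1.0], lr=0.1, momentum=0.9)` drives f(x) = x² (gradient 2x) toward zero over repeated `step` calls.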
Kohya S
048e7cd428
add lion optimizer support
2023-02-19 15:26:14 +09:00
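Lion (from the "Symbolic Discovery of Optimization Algorithms" paper) updates parameters using only the sign of an interpolated momentum term, with decoupled weight decay. A scalar sketch of the published update rule, not the repository's implementation:

```python
# Minimal scalar Lion step, following the published rule:
#   u = sign(beta1 * m + (1 - beta1) * g)   # direction only, fixed magnitude
#   p = p - lr * (u + weight_decay * p)     # decoupled weight decay
#   m = beta2 * m + (1 - beta2) * g         # EMA of gradients

def lion_step(p, g, m, lr=0.1, beta1=0.9, beta2=0.99, weight_decay=0.0):
    interp = beta1 * m + (1 - beta1) * g
    u = (interp > 0) - (interp < 0)  # sign() for a plain float
    p = p - lr * (u + weight_decay * p)
    m = beta2 * m + (1 - beta2) * g
    return p, m
```

Because the update is sign-based, every step moves each parameter by exactly `lr` (plus weight decay), which is why Lion typically needs a much smaller learning rate than AdamW.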
Kohya S
43c0a69843
Add noise_offset
2023-02-14 21:15:48 +09:00
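noise_offset adds a small per-sample, per-channel constant to the diffusion noise so the model can better learn overall image brightness. A hedged NumPy sketch of the idea (the function name, shapes, and the 0.1 default are illustrative; the actual code operates on torch latents):

```python
import numpy as np

def sample_noise(latent_shape, noise_offset=0.1, rng=None):
    """Gaussian noise plus a per-(sample, channel) constant offset.

    The (B, C, 1, 1) term broadcasts over height and width, shifting each
    channel's mean rather than adding per-pixel noise.
    """
    rng = rng or np.random.default_rng()
    b, c, h, w = latent_shape
    noise = rng.standard_normal(latent_shape)
    if noise_offset:
        noise = noise + noise_offset * rng.standard_normal((b, c, 1, 1))
    return noise
```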
Kohya S
188e54b760
support multiple init words
2023-02-11 15:00:11 +09:00
Kohya S
3a72e6f003
add tag dropout
2023-02-09 21:35:27 +09:00
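Tag dropout randomly removes individual comma-separated tags from a caption during training, a regularizer against overfitting to exact tag sets. An illustrative sketch (the function name and exact splitting rules are assumptions, not the repository's logic):

```python
import random

def dropout_tags(caption, dropout_rate=0.1, rng=None):
    """Drop each comma-separated tag independently with probability dropout_rate."""
    rng = rng or random.Random()
    tags = [t.strip() for t in caption.split(",")]
    kept = [t for t in tags if rng.random() >= dropout_rate]
    return ", ".join(kept)
```

At `dropout_rate=0.0` the caption passes through unchanged; at `1.0` every tag is dropped.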
Kohya S
e42b2f7aa9
conditional caption dropout (in progress)
2023-02-07 22:28:56 +09:00
Kohya S
ea2dfd09ef
update bucketing features
2023-02-05 21:37:46 +09:00
hitomi
26a81d075c
add --persistent_data_loader_workers option
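The --persistent_data_loader_workers option maps onto PyTorch's `DataLoader(persistent_workers=True)`, which keeps worker processes alive between epochs instead of respawning them (saving startup cost per epoch). A sketch of how such a flag is typically wired up; the plumbing here is illustrative:

```python
import argparse

def build_parser():
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--persistent_data_loader_workers",
        action="store_true",
        help="keep DataLoader worker processes alive across epochs "
             "(forwarded as DataLoader(persistent_workers=True))",
    )
    return parser

args = build_parser().parse_args(["--persistent_data_loader_workers"])
# The flag would then be forwarded when building the loader, e.g.:
# DataLoader(dataset, num_workers=n,
#            persistent_workers=args.persistent_data_loader_workers)
```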
2023-02-01 16:02:15 +08:00
Kohya S
25566182a8
Support newer training args
2023-01-26 21:37:14 +09:00
Kohya S
c1b14fcdd6
initial version of TI
2023-01-12 20:47:08 +09:00