fur0ut0
8abb8645ae
add detailed dataset config feature via extra config file (#227)
* add config file schema
* change config file specification
* refactor config utility
* unify batch_size to train_batch_size
* fix indent size
* use batch_size instead of train_batch_size
* make cache_latents configurable on subset
* rename options
* bucket_reso_range
* shuffle_keep_tokens
* update readme
* revert to min_bucket_reso & max_bucket_reso
* use subset structure in dataset
* format import lines
* split mode specific options
* use only valid subset
* change valid subsets name
* manage multiple datasets by dataset group
* update config file sanitizer
* prune redundant validation
* add comments
* update type annotation
* rename json_file_name to metadata_file
* ignore when image dir is invalid
* fix tag shuffle and dropout
* ignore duplicated subset
* add method to check latent cachability
* fix format
* fix bug
* update caption dropout default values
* update annotation
* fix bug
* add option to enable bucket shuffle across dataset
* update blueprint generate function
* use blueprint generator for dataset initialization
* delete duplicated function
* update config readme
* delete debug print
* print dataset and subset info as info
* enable bucket_shuffle_across_dataset option
* update config readme for clarification
* compensate quotes for string option example
* fix bug of bad usage of join
* conserve trained metadata backward compatibility
* enable shuffle in data loader by default
* delete resolved TODO
* add comment for image data handling
* fix reference bug
* fix undefined variable bug
* prevent raise overwriting
* assert image_dir and metadata_file validity
* add debug message for ignoring subset
* fix inconsistent import statement
* loosen overly strict validation on float values
* sanitize argument parser separately
* make image_dir optional for fine tuning dataset
* fix import
* fix trailing characters in print
* parse flexible dataset config deterministically
* use relative import
* print supplementary message for parsing error
* add note about different methods
* add note of benefit of separate dataset
* add error example
* add note for english readme plan
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-03-01 20:58:08 +09:00
Kohya S
82707654ad
support sample generation in TI training
2023-02-28 22:05:31 +09:00
Kohya S
57c565c402
support sample generation in TI training
2023-02-28 22:05:10 +09:00
Kohya S
dd523c94ff
sample images in training (not fully tested)
2023-02-27 17:48:32 +09:00
Kohya S
a28f9ae7a3
support tokenizer caching for offline training/gen
2023-02-25 18:46:59 +09:00
Kohya S
9993792656
latents upscaling in highres fix, vae batch size
2023-02-25 18:17:18 +09:00
Kohya S
f0ae7eea95
Update README.md
v0.4.4
2023-02-23 21:59:20 +09:00
Kohya S
b22b0a5c75
Merge pull request #223 from kohya-ss/control_net
support ControlNet
2023-02-23 21:53:05 +09:00
Kohya S
c7a13c89c7
Merge branch 'main' into control_net
2023-02-23 21:51:03 +09:00
Kohya S
39a70f10bd
Merge pull request #222 from kohya-ss/dev
fix training instability issue, add metadata
2023-02-23 21:50:38 +09:00
Kohya S
a3c0e4cf44
update change history
2023-02-23 21:49:34 +09:00
Kohya S
9b13444b9c
raise error if options conflict
2023-02-23 21:35:47 +09:00
Kohya S
0eb01dea55
add max_grad_norm to metadata
2023-02-23 21:34:38 +09:00
Kohya S
a3aa3b1712
fix typos
2023-02-23 21:14:44 +09:00
Kohya S
95b5aed41b
Merge pull request #221 from space-nuko/add-more-metadata
Add more missing metadata
2023-02-23 21:14:26 +09:00
Kohya S
d9184ab21c
remove LoRA-ControlNet
2023-02-23 21:01:13 +09:00
Kohya S
e7dd77836d
Merge branch 'main' into control_net
2023-02-23 20:57:34 +09:00
Kohya S
4c5c486d28
Merge branch 'main' into dev
2023-02-23 20:57:17 +09:00
Kohya S
f403ac6132
fix issue where float32 training doesn't work in some cases
2023-02-23 20:56:41 +09:00
space-nuko
b39cf6e2c0
Add more missing metadata
2023-02-23 02:25:24 -08:00
Kohya S
71b728d5fc
Update README.md
2023-02-22 22:25:53 +09:00
Kohya S
f0ef81f865
Merge pull request #219 from kohya-ss/dev
Dev
2023-02-22 22:21:04 +09:00
Kohya S
f68a48b354
update readme
2023-02-22 22:19:36 +09:00
Kohya S
7a0d2a2d45
update readme
2023-02-22 22:16:23 +09:00
Kohya S
e13e503cbc
update readme
2023-02-22 22:10:32 +09:00
Kohya S
125039f491
update readme
2023-02-22 22:06:47 +09:00
Kohya S
f2b300a221
Add about optimizer
2023-02-22 22:04:53 +09:00
Kohya S
9ab964d0b8
Add Adafactor optimizer
2023-02-22 21:09:47 +09:00
Kohya S
663aad2b0d
refactor get_scheduler etc.
2023-02-20 22:47:43 +09:00
Kohya S
12d30afb39
Merge pull request #212 from mgz-dev/optimizer-expand-and-refactor
expand optimizer options and refactor
2023-02-20 20:13:41 +09:00
Kohya S
107fa754e5
Merge branch 'dev' into optimizer-expand-and-refactor
2023-02-20 20:12:42 +09:00
Kohya S
a17d1180cb
Merge pull request #209 from BootsofLagrangian/dadaptation
Dadaptation optimizer
2023-02-20 20:06:55 +09:00
Kohya S
014fd3d037
support original controlnet
2023-02-20 12:54:44 +09:00
mgz-dev
b29c5a750c
expand optimizer options and refactor
Refactor code to make it easier to add new optimizers, and support alternate optimizer parameters
- move redundant code to train_util for initializing optimizers
- add SGD Nesterov optimizers as option (since they are already available)
- add new parameters which may be helpful for tuning existing and new optimizers
2023-02-19 17:45:09 -06:00
unknown
b612d0b091
apply dadaptation
2023-02-19 19:07:26 +09:00
Kohya S
d94c0d70fe
support network mul from prompt
2023-02-19 18:43:35 +09:00
unknown
045a3dbe48
apply dadaptation
2023-02-19 18:37:07 +09:00
Kohya S
08ae46b163
Merge pull request #208 from space-nuko/add-optimizer-to-metadata
Add optimizer to metadata
v0.4.3
2023-02-19 17:21:57 +09:00
space-nuko
4e5db58a71
Add optimizer to metadata
2023-02-18 23:28:36 -08:00
Kohya S
e45e272e9d
Merge branch 'main' into control_net
2023-02-19 16:25:00 +09:00
Kohya S
a9d29ac78c
Merge pull request #207 from kohya-ss/dev
Dev
2023-02-19 15:29:40 +09:00
Kohya S
5c065eee79
update readme
2023-02-19 15:26:21 +09:00
Kohya S
048e7cd428
add lion optimizer support
2023-02-19 15:26:14 +09:00
Kohya S
a76ad2d1d5
add comment for future requirement update
2023-02-19 15:25:01 +09:00
Kohya S
9d0f9736bf
Merge pull request #202 from vladmandic/main
fix git path
2023-02-19 15:01:21 +09:00
Kohya S
00bb8a65a6
Merge pull request #200 from Isotr0py/lowram
Add '--lowram' argument
2023-02-19 14:32:32 +09:00
Vladimir Mandic
dac2bd163a
fix git path
2023-02-17 14:19:08 -05:00
Isotr0py
78d1fb5ce6
Add '--lowram' argument
2023-02-17 12:08:54 +08:00
Kohya S
14d7b24619
Merge pull request #198 from kohya-ss/dev
Dev
2023-02-16 22:35:47 +09:00
Kohya S
3bc0d83769
update readme
2023-02-16 22:21:51 +09:00