Kohya S
2b07a92c8d
Fix error in applying mask in Attention and add LoRA converter script
2024-08-21 12:30:23 +09:00
Kohya S
e17c42cb0d
Add BFL/Diffusers LoRA converter #1467 #1458 #1483
2024-08-21 12:28:45 +09:00
Kohya S
dbed5126bd
chore: formatting
2024-08-20 19:33:47 +09:00
Kohya S
9381332020
revert merge function and add option to use new func
2024-08-20 19:32:26 +09:00
Kohya S
6f6faf9b5a
fix to work with ai-toolkit LoRA
2024-08-20 19:16:25 +09:00
exveria1015
7e688913ae
fix: fix Flux LoRA merge function
2024-08-18 12:38:05 +09:00
Kohya S
e45d3f8634
add merge LoRA script
2024-08-16 22:19:21 +09:00
Kohya S
56d7651f08
add experimental split mode for FLUX
2024-08-13 22:28:39 +09:00
Kohya S
8a0f12dde8
update FLUX LoRA training
2024-08-10 23:42:05 +09:00
Kohya S
358f13f2c9
fix alpha being ignored
2024-08-10 14:03:59 +09:00
Kohya S
36b2e6fc28
add FLUX.1 LoRA training
2024-08-09 22:56:48 +09:00
Kohya S
1a104dc75e
make forward/backward paths same ref #1363
2024-06-09 19:26:36 +09:00
rockerBOO
00513b9b70
Add LoRA+ LR Ratio info message to logger
2024-05-23 22:27:12 -04:00
Kohya S
146edce693
support Diffusers-based SDXL LoRA keys for inference
2024-05-18 11:05:04 +09:00
Kohya S
16677da0d9
fix create_network_from_weights not working
2024-05-12 22:15:07 +09:00
Kohya S
44190416c6
update docs etc.
2024-05-12 17:01:20 +09:00
Kohya S
3c8193f642
revert lora+ for lora_fa
2024-05-12 17:00:51 +09:00
Kohya S
3fd8cdc55d
fix dylora loraplus
2024-05-06 14:03:19 +09:00
Kohya S
7fe81502d0
update loraplus on dylora/lora_fa
2024-05-06 11:09:32 +09:00
Kohya S
58c2d856ae
support block dim/lr for sdxl
2024-05-03 22:18:20 +09:00
Kohya S
969f82ab47
move loraplus args from args to network_args, simplify log lr desc
2024-04-29 20:04:25 +09:00
Kohya S
834445a1d6
Merge pull request #1233 from rockerBOO/lora-plus
...
Add LoRA+ support
2024-04-29 18:05:12 +09:00
rockerBOO
68467bdf4d
Fix unset or invalid LR from making a param_group
2024-04-11 17:33:19 -04:00
rockerBOO
75833e84a1
Fix default LR, Add overall LoRA+ ratio, Add log
...
`--loraplus_ratio` added for both TE and UNet
Add log for lora+
2024-04-08 19:23:02 -04:00
rockerBOO
1933ab4b48
Fix default_lr being applied
2024-04-03 12:46:34 -04:00
Kohya S
b748b48dbb
fix attention couple + deep shrink causing error at some resolutions
2024-04-03 12:43:08 +09:00
rockerBOO
c7691607ea
Add LoRA-FA for LoRA+
2024-04-01 15:43:04 -04:00
rockerBOO
f99fe281cb
Add LoRA+ support
2024-04-01 15:38:26 -04:00
Yuta Hayashibe
5d5f39b6e6
Replaced print with logger
2024-02-25 01:24:11 +09:00
Kohya S
8b7c14246a
change some log output to print
2024-02-24 20:50:00 +09:00
Kohya S
52b3799989
fix format, add new conv rank to metadata comment
2024-02-24 20:49:41 +09:00
Kohya S
0e703608f9
Merge branch 'dev' into resize_lora-add-rank-for-conv
2024-02-24 20:09:38 +09:00
Kohya S
fb9110bac1
format by black
2024-02-24 20:00:57 +09:00
Kohya S
86279c8855
Merge branch 'dev' into DyLoRA-xl
2024-02-24 19:18:36 +09:00
tamlog06
a6f1ed2e14
fix dylora create_modules error
2024-02-18 13:20:47 +00:00
Kohya S
baa0e97ced
Merge branch 'dev' into dev_device_support
2024-02-17 11:54:07 +09:00
Kohya S
cbe9c5dc06
support deep shrink with regional lora, add prompter module
2024-02-12 14:17:27 +09:00
Kohya S
358ca205a3
Merge branch 'dev' into dev_device_support
2024-02-12 13:01:54 +09:00
Yuta Hayashibe
5f6bf29e52
Replace print with logger if they are logs (#905)
...
* Add get_my_logger()
* Use logger instead of print
* Fix log level
* Removed line-breaks for readability
* Use setup_logging()
* Add rich to requirements.txt
* Make simple
* Use logger instead of print
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2024-02-04 18:14:34 +09:00
mgz
1492bcbfa2
add --new_conv_rank option
...
update script to also take a separate conv rank value
2024-02-03 23:18:55 -06:00
mgz
bf2de5620c
fix formatting in resize_lora.py
2024-02-03 20:09:37 -06:00
Disty0
a6a2b5a867
Fix IPEX support and add XPU device to device_utils
2024-01-31 17:32:37 +03:00
Kohya S
2ca4d0c831
Merge pull request #1054 from akx/mps
...
Device support improvements (MPS)
2024-01-31 21:30:12 +09:00
mgz
d4b9568269
fix broken import in svd_merge_lora script
...
remove missing import, and remove unused imports
2024-01-28 11:59:07 -06:00
Aarni Koskela
478156b4f7
Refactor device determination to function; add MPS fallback
2024-01-23 14:29:03 +02:00
Kohya S
c59249a664
Add options to reduce memory usage in extract_lora_from_models.py closes #1059
2024-01-20 18:45:54 +09:00
Kohya S
0fb9ecf1f3
format by black, add ja comment
2023-11-25 21:05:55 +09:00
Won-Kyu Park
2c1e669bd8
add min_diff, clamp_quantile args
...
based on https://github.com/bmaltais/kohya_ss/pull/1332 a9ec90c40a
2023-11-10 02:35:55 +09:00
Won-Kyu Park
e20e9f61ac
use **kwargs and change svd() calling convention to make svd() reusable
...
* add required attributes to model_org, model_tuned, save_to
* set "*_alpha" using str(float(foo))
2023-11-10 02:35:10 +09:00
Kohya S
8b79e3b06c
fix typos
2023-10-09 18:00:45 +09:00