Kohya S
6edbe00547
feat: update libraries, remove warnings
2025-08-16 20:07:03 +09:00
Yuta Hayashibe
5f6bf29e52
Replace print with logger if they are logs (#905)
* Add get_my_logger()
* Use logger instead of print
* Fix log level
* Removed line-breaks for readability
* Use setup_logging()
* Add rich to requirements.txt
* Make simple
* Use logger instead of print
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2024-02-04 18:14:34 +09:00
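The bullets above name `setup_logging()` and `get_my_logger()` and add `rich` to the requirements. A minimal sketch of that pattern (illustrative only, not the actual sd-scripts implementation; the fallback to a plain handler is an assumption):

```python
import logging

def setup_logging(level=logging.INFO):
    """Configure the root logger once; prefer rich's handler when available."""
    if logging.root.handlers:
        return  # already configured elsewhere, do nothing
    try:
        from rich.logging import RichHandler  # optional dependency from requirements.txt
        handler = RichHandler()
    except ImportError:
        handler = logging.StreamHandler()
    logging.basicConfig(level=level, handlers=[handler], format="%(message)s")

def get_my_logger(name):
    """Return a named logger, used in place of bare print()."""
    return logging.getLogger(name)

setup_logging()
logger = get_my_logger(__name__)
logger.info("training started")  # instead of print("training started")
```

Routing output through `logging` lets callers control verbosity with log levels instead of deleting `print` calls.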
Aarni Koskela
ef50436464
Fix typo --spda (it's --sdpa)
2024-01-16 14:32:48 +02:00
Kohya S
46cf41cc93
Merge pull request #961 from rockerBOO/attention-processor
Add attention processor
2023-12-03 21:24:12 +09:00
Kohya S
c61e3bf4c9
make separate U-Net for inference
2023-11-26 18:11:30 +09:00
Kohya S
6d6d86260b
add Deep Shrink
2023-11-23 19:40:48 +09:00
rockerBOO
c856ea4249
Add attention processor
2023-11-19 12:11:36 -05:00
Yuta Hayashibe
27f9b6ffeb
update typos to v1.16.15 and fix typos
2023-10-01 21:51:24 +09:00
Kohya S
11e8c7d8ff
fix ControlNet training to work
2023-06-24 09:35:33 +09:00
ykume
0dfffcd88a
remove unnecessary import
2023-06-11 21:46:05 +09:00
ykume
9e1683cf2b
support sdpa
2023-06-11 21:26:15 +09:00
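"sdpa" here refers to scaled dot-product attention, softmax(QKᵀ/√d)V (in PyTorch, `torch.nn.functional.scaled_dot_product_attention`). A pure-Python sketch of the computation, for reference only:

```python
import math

def sdpa(q, k, v):
    """Scaled dot-product attention: softmax(q @ k^T / sqrt(d)) @ v.
    q, k, v are lists of row vectors (lists of floats)."""
    d = len(q[0])
    out = []
    for qi in q:
        # attention scores of this query against every key
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        # numerically stable softmax over the scores
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # output row is the attention-weighted sum of value rows
        out.append([sum(w * vj[c] for w, vj in zip(weights, v))
                    for c in range(len(v[0]))])
    return out

q = [[1.0, 0.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0]]
print(sdpa(q, k, v))  # ≈ [[1.6605, 2.6605]]
```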
ykume
035dd3a900
fix mem_eff_attn not working
2023-06-11 17:08:21 +09:00
ykume
7f6b581ef8
support memory efficient attn (not xformers)
2023-06-11 16:54:41 +09:00
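Memory-efficient attention avoids materializing the full score matrix by processing keys/values in chunks with an online softmax. A pure-Python sketch of the idea (an assumption about the technique, not the actual sd-scripts code; names are illustrative):

```python
import math

def mem_eff_attention(q, k, v, kv_chunk=2):
    """Attention with keys/values consumed in chunks via an online softmax,
    so only one chunk of scores exists at a time."""
    d = len(q[0])
    scale = 1.0 / math.sqrt(d)
    out = []
    for qi in q:
        m = float("-inf")          # running max of scores seen so far
        z = 0.0                    # running softmax denominator
        acc = [0.0] * len(v[0])    # running weighted sum of value rows
        for s0 in range(0, len(k), kv_chunk):
            ks, vs = k[s0:s0 + kv_chunk], v[s0:s0 + kv_chunk]
            scores = [scale * sum(a * b for a, b in zip(qi, kj)) for kj in ks]
            new_m = max([m] + scores)
            corr = math.exp(m - new_m)  # rescale old state (exp(-inf) == 0.0)
            w = [math.exp(s - new_m) for s in scores]
            z = z * corr + sum(w)
            acc = [a * corr + sum(wi * vj[c] for wi, vj in zip(w, vs))
                   for c, a in enumerate(acc)]
            m = new_m
        out.append([a / z for a in acc])
    return out
```

The result is identical to plain attention for any chunk size; only peak memory changes.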
Kohya S
dccdb8771c
support sample generation in training
2023-06-07 08:12:52 +09:00
Kohya S
c0a7df9ee1
fix eps value, enable xformers, etc.
2023-06-03 21:29:27 +09:00
Kohya S
5db792b10b
initial commit for original U-Net
2023-06-03 19:24:47 +09:00