rockerBOO
e8b3254858
Add flux_train_utils tests for get_noisy_model_input_and_timesteps
2025-03-20 15:01:15 -04:00
rockerBOO
16cef81aea
Refactor sigmas and timesteps
2025-03-20 14:32:56 -04:00
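The sigma/timestep refactor above concerns FLUX-style flow matching, where the noisy input is a linear interpolation between the clean latents and noise. A minimal sketch under stated assumptions (the function name matches the tested helper, but the uniform sigma sampling and the signature are simplifications; the real code supports several sampling schemes):

```python
import torch

def get_noisy_model_input_and_timesteps(latents, noise, num_train_timesteps=1000):
    """Hypothetical sketch of flow-matching noising for FLUX-style models.

    Samples one sigma per example in [0, 1) and linearly interpolates
    between clean latents and pure noise; timesteps are sigma scaled to
    the scheduler's range.
    """
    b = latents.shape[0]
    sigmas = torch.rand(b, device=latents.device)   # one sigma per example
    timesteps = sigmas * num_train_timesteps        # map sigma to timestep range
    sigmas = sigmas.view(b, 1, 1, 1)                # broadcast over C, H, W
    noisy_model_input = (1.0 - sigmas) * latents + sigmas * noise
    return noisy_model_input, timesteps
```

At sigma = 0 the input is the clean latents, at sigma = 1 it is pure noise, which is the property the refactored sigma/timestep code has to preserve.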
Kohya S
d151833526
docs: update README with recent changes and specify version for pytorch-optimizer
2025-03-20 22:05:29 +09:00
Kohya S.
936d333ff4
Merge pull request #1985 from gesen2egee/pytorch-optimizer
...
Support pytorch_optimizer
2025-03-20 22:01:03 +09:00
rockerBOO
f974c6b257
change order to match upstream
2025-03-19 14:27:43 -04:00
rockerBOO
5d5a7d2acf
Fix IP noise calculation
2025-03-19 13:50:04 -04:00
rockerBOO
1eddac26b0
Separate the random tensor into a variable, and make sure it is on device
2025-03-19 00:49:42 -04:00
rockerBOO
8e6817b0c2
Remove double noise
2025-03-19 00:45:13 -04:00
rockerBOO
d93ad90a71
Add perturbation on noisy_model_input if needed
2025-03-19 00:37:27 -04:00
rockerBOO
7197266703
Perturbed noise should be separate from input noise
2025-03-19 00:25:51 -04:00
gesen2egee
5b210ad717
update prodigyopt and prodigy-plus-schedule-free
2025-03-19 10:49:06 +08:00
rockerBOO
b81bcd0b01
Move IP noise gamma to noise creation to remove complexity and align noise for target loss
2025-03-18 21:36:55 -04:00
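The IP (input perturbation) noise commits above move the gamma term into noise creation, so the same perturbed noise serves both the model input and the loss target. A hedged sketch of the idea (function name and default gamma are hypothetical, not the repo's API):

```python
import torch

def apply_ip_noise(noise, ip_noise_gamma=0.1):
    """Hypothetical sketch: add input-perturbation noise at noise-creation
    time, so downstream code sees a single perturbed noise tensor instead
    of perturbing the model input and the target separately."""
    return noise + ip_noise_gamma * torch.randn_like(noise)
```

Applying the perturbation once, at the source, avoids the double-noise and misaligned-target bugs the surrounding commits fix.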
rockerBOO
6f4d365775
zeros_like because we are adding
2025-03-18 18:53:34 -04:00
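The zeros_like/ones_like commits above capture a common accumulator pitfall: a buffer that is summed into must start at zero, because `ones_like` would add a spurious +1 everywhere. A minimal illustration (names hypothetical):

```python
import torch

def accumulate(terms):
    """Sum a list of tensors into an accumulator. The accumulator must be
    initialized with zeros_like; starting from ones_like would bias every
    element of the sum by 1."""
    total = torch.zeros_like(terms[0])
    for t in terms:
        total += t
    return total
```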
rockerBOO
a4f3a9fc1a
Use ones_like
2025-03-18 18:44:21 -04:00
rockerBOO
b425466e7b
Fix IP noise gamma to use random values
2025-03-18 18:42:35 -04:00
rockerBOO
c8be141ae0
Apply IP gamma to noise fix
2025-03-18 15:42:18 -04:00
rockerBOO
0b25a05e3c
Add IP noise gamma for Flux
2025-03-18 15:40:40 -04:00
rockerBOO
3647d065b5
Cache weight norm estimates on initialization; update norms every step
2025-03-18 14:25:09 -04:00
Disty0
620a06f517
Check for uppercase file extension too
2025-03-17 17:44:29 +03:00
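The extension check above amounts to a case-insensitive comparison, so `.JPG` matches as well as `.jpg`. A small sketch (the extension set and helper name are hypothetical):

```python
import os

# Hypothetical extension set for illustration only
IMG_EXTENSIONS = {".jpg", ".jpeg", ".png", ".webp", ".jxl"}

def is_image_file(path: str) -> bool:
    """Case-insensitive extension check: lowercase the extension once
    instead of listing every uppercase/lowercase variant."""
    return os.path.splitext(path)[1].lower() in IMG_EXTENSIONS
```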
Disty0
564ec5fb7f
use extend instead of +=
2025-03-17 17:41:03 +03:00
Disty0
7e90cdd47a
use bytearray and add type hints
2025-03-17 17:26:08 +03:00
gesen2egee
e5b5c7e1db
Update requirements.txt
2025-03-15 13:29:32 +08:00
rockerBOO
ea53290f62
Add LoRA-GGPO for Flux
2025-03-06 00:00:38 -05:00
Kohya S.
75933d70a1
Merge pull request #1960 from kohya-ss/sd3_safetensors_merge
...
Sd3 safetensors merge
2025-03-05 23:28:38 +09:00
Kohya S
aa2bde7ece
docs: add utility script for merging SD3 weights into a single .safetensors file
2025-03-05 23:24:52 +09:00
sdbds
3f49053c90
faster; fix bug for SDXL super SD1.5 assert, can't use 32
2025-03-02 19:32:06 +08:00
Ivan Chikish
acdca2abb7
Fix [occasionally] missing text encoder attn modules
...
Should fix #1952
I added an alternative name for CLIPAttention.
I have no idea why this name changed.
Now it should accept both names.
2025-03-01 20:35:45 +03:00
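Accepting both class names can be sketched as matching modules by class name against a small set. The second name below is an assumption about how transformers renamed the attention class; the helper name is hypothetical:

```python
import torch.nn as nn

# Assumed set of accepted class names; "CLIPSdpaAttention" is a guess at
# the renamed class, not confirmed by the commit above.
TEXT_ENCODER_ATTN_CLASSES = {"CLIPAttention", "CLIPSdpaAttention"}

def collect_attn_modules(model: nn.Module):
    """Collect text encoder attention modules by class name, tolerating
    either name so the code works across transformers versions."""
    return [
        (name, module)
        for name, module in model.named_modules()
        if module.__class__.__name__ in TEXT_ENCODER_ATTN_CLASSES
    ]
```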
Kohya S
ba5251168a
fix: save tensors in their original dtype, add save_precision option
2025-03-01 10:31:39 +09:00
Kohya S
272f4c3775
Merge branch 'sd3' into sd3_safetensors_merge
2025-02-28 23:52:36 +09:00
Kohya S
734333d0c9
feat: enhance merging logic for safetensors models to handle key prefixes correctly
2025-02-28 23:52:29 +09:00
Disty0
2f69f4dbdb
fix typo
2025-02-27 00:30:19 +03:00
Disty0
9a415ba965
JPEG XL support
2025-02-27 00:21:57 +03:00
Kohya S
3d79239be4
docs: update README to include recent improvements in validation loss calculation
2025-02-26 21:21:04 +09:00
Kohya S
ec350c83eb
Merge branch 'dev' into sd3
2025-02-26 21:17:29 +09:00
Kohya S.
49651892ce
Merge pull request #1903 from kohya-ss/val-loss-improvement
...
Val loss improvement
2025-02-26 21:15:14 +09:00
Kohya S
1fcac98280
Merge branch 'sd3' into val-loss-improvement
2025-02-26 21:09:10 +09:00
Kohya S.
b286304e5f
Merge pull request #1953 from Disty0/dev
...
Update IPEX libs
2025-02-26 21:03:09 +09:00
Kohya S
ae409e83c9
fix: FLUX/SD3 network training not working without caching latents; closes #1954
2025-02-26 20:56:32 +09:00
Kohya S
5228db1548
feat: add script to merge multiple safetensors files into a single file for SD3
2025-02-26 20:50:58 +09:00
Kohya S
f4a0047865
feat: support metadata loading in MemoryEfficientSafeOpen
2025-02-26 20:50:44 +09:00
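Metadata in a .safetensors file lives in the JSON header, whose length is given by the first 8 bytes as a little-endian uint64. A stdlib-only sketch of reading it (helper name hypothetical; the repo's MemoryEfficientSafeOpen class itself is not shown):

```python
import json
import struct

def read_safetensors_metadata(path: str) -> dict:
    """Read the optional __metadata__ dict from a .safetensors header.

    Format: 8 bytes little-endian uint64 header length, then a JSON header
    that may contain a "__metadata__" key with string-to-string metadata.
    """
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len))
    return header.get("__metadata__", {})
```

Reading only the header keeps the operation memory-efficient even for multi-gigabyte checkpoints, since tensor data is never touched.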
Disty0
f68702f71c
Update IPEX libs
2025-02-25 21:27:41 +03:00
Kohya S.
6e90c0f86c
Merge pull request #1909 from rockerBOO/progress_bar
...
Move progress bar to account for sampling image first
2025-02-24 18:57:44 +09:00
Kohya S
67fde015f7
Merge branch 'dev' into sd3
2025-02-24 18:56:15 +09:00
Kohya S.
386b7332c6
Merge pull request #1918 from tsukimiya/fix_vperd_warning
...
Remove v-pred warning.
2025-02-24 18:55:25 +09:00
Kohya S
905f081798
Merge branch 'dev' into sd3
2025-02-24 18:54:28 +09:00
Kohya S.
59ae9ea20c
Merge pull request #1945 from yidiq7/dev
...
Remove position_ids for V2
2025-02-24 18:53:46 +09:00
Kohya S
efb2a128cd
fix wandb val logging
2025-02-21 22:07:35 +09:00
Yidi
13df47516d
Remove position_ids for V2
...
The position_ids cause errors with newer versions of transformers.
This has already been fixed in convert_ldm_clip_checkpoint_v1() but
not in v2.
The new code applies the same fix to convert_ldm_clip_checkpoint_v2().
2025-02-20 04:49:51 -05:00
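The fix described above amounts to dropping the stale position_ids key from the converted state dict before loading, since newer transformers versions no longer register it as a buffer. A minimal sketch (helper name hypothetical):

```python
def strip_position_ids(state_dict: dict) -> dict:
    """Drop position_ids entries from a converted CLIP state dict so it
    loads cleanly on transformers versions that no longer register the
    position_ids buffer."""
    return {k: v for k, v in state_dict.items() if not k.endswith("position_ids")}
```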
rockerBOO
7f2747176b
Use resize_image where resizing is required
2025-02-19 14:20:40 -05:00
rockerBOO
ca1c129ffd
Fix metadata
2025-02-19 14:20:40 -05:00