Fix the LoRA dropout issue in the Anima model during LoRA training. (#2272)

* Support `network_reg_alphas` and fix a bug when setting `rank_dropout` while training a LoRA for the Anima model

* Update anima_train_network.md

* Update anima_train_network.md

* Remove network_reg_alphas

* Update document
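The `rank_dropout` option named above zeroes entire ranks of the low-rank update during training rather than individual activations. As a rough illustration of the general technique, here is a minimal NumPy sketch; the function name, shapes, and rescaling are illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def lora_forward(x, down, up, rank_dropout=0.0, training=True):
    """Apply a LoRA update h = (x @ down) @ up with optional rank dropout.

    x: (batch, in_dim), down: (in_dim, rank), up: (rank, out_dim).
    """
    h = x @ down  # project into the low-rank space
    if training and rank_dropout > 0.0:
        # Zero whole ranks at random, then rescale the survivors so the
        # expected magnitude of the update is unchanged.
        keep = (rng.random(h.shape[-1]) > rank_dropout).astype(h.dtype)
        h = h * keep / (1.0 - rank_dropout)
    return h @ up
```

With `rank_dropout=0.0`, or with `training=False` at inference time, this reduces to the plain low-rank product `x @ down @ up`.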
duongve13112002
2026-02-23 13:13:40 +07:00
committed by GitHub
parent 48d368fa55
commit 609d1292f6
3 changed files with 10 additions and 7 deletions


@@ -652,4 +652,4 @@ The following metadata is saved in the LoRA model file:
* `ss_sigmoid_scale`
* `ss_discrete_flow_shift`
</details>
</details>