Dropout and Max Norm Regularization for LoRA training (#545)

* Instantiate max_norm

* minor

* Move to end of step

* argparse

* metadata

* phrasing

* Sqrt ratio and logging

* fix logging

* Dropout test

* Dropout Args

* Dropout changed to affect LoRA only

---------

Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
Author: AI-Casanova
Date: 2023-06-01 00:58:38 -05:00
Committed by: GitHub
Parent: 5931948adb
Commit: 9c7237157d
4 changed files with 77 additions and 9 deletions
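
The commit message above describes two LoRA-specific regularizers: dropout applied only on the low-rank path, and a max-norm cap applied at the end of each optimizer step, where the square root of the ratio is multiplied into each of the two LoRA factors (so their product shrinks by the full ratio), with the clip count logged. A minimal sketch of that idea, assuming a hypothetical `LoRALinear` wrapper and `apply_max_norm` helper (names and signatures are illustrative, not the repository's actual API):

```python
import math
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Hypothetical LoRA wrapper: frozen base linear plus a low-rank
    update, with dropout applied to the LoRA path only."""

    def __init__(self, base: nn.Linear, rank: int = 4,
                 alpha: float = 1.0, dropout: float = 0.0):
        super().__init__()
        self.base = base
        self.lora_down = nn.Linear(base.in_features, rank, bias=False)
        self.lora_up = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_up.weight)   # update starts as a no-op
        self.dropout = nn.Dropout(dropout)    # does NOT touch the base output
        self.scale = alpha / rank

    def forward(self, x):
        lora = self.lora_up(self.dropout(self.lora_down(x))) * self.scale
        return self.base(x) + lora


def apply_max_norm(modules, max_norm: float):
    """Run at the end of each training step. For every LoRA pair whose
    combined update norm exceeds max_norm, multiply BOTH factors by
    sqrt(max_norm / norm), so the product is rescaled by the full ratio.
    Returns (clipped_count, max_norm_seen, mean_norm) for logging."""
    clipped = 0
    norms = []
    with torch.no_grad():
        for m in modules:
            update = m.lora_up.weight @ m.lora_down.weight * m.scale
            norm = update.norm().item()
            norms.append(norm)
            if norm > max_norm:
                ratio = math.sqrt(max_norm / norm)
                m.lora_up.weight.mul_(ratio)
                m.lora_down.weight.mul_(ratio)
                clipped += 1
    return clipped, max(norms), sum(norms) / len(norms)
```

The sqrt split keeps the two factors at comparable magnitudes instead of loading the entire correction onto one of them, which is why the commit log mentions a "Sqrt ratio".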


@@ -3638,4 +3638,4 @@ class collater_class:
         # set epoch and step
         dataset.set_current_epoch(self.current_epoch.value)
         dataset.set_current_step(self.current_step.value)
-        return examples[0]
+        return examples[0]