Importing more optimizer libraries

This commit is contained in:
Darren Laurie
2025-03-15 17:23:32 +08:00
parent 91e3ff173e
commit 4a3ced5fb9
2 changed files with 2 additions and 1 deletion


@@ -3599,7 +3599,7 @@ def add_optimizer_arguments(parser: argparse.ArgumentParser):
 "Lion8bit, PagedLion8bit, Lion, SGDNesterov, SGDNesterov8bit, "
 "DAdaptation(DAdaptAdamPreprint), DAdaptAdaGrad, DAdaptAdam, DAdaptAdan, DAdaptAdanIP, DAdaptLion, DAdaptSGD, "
 "AdaFactor. "
-"Also, you can use any optimizer by specifying the full path to the class, like 'bitsandbytes.optim.AdEMAMix8bit' or 'bitsandbytes.optim.PagedAdEMAMix8bit'.",
+"Also, you can use any optimizer by specifying the full path to the class, like 'bitsandbytes.optim.AdEMAMix8bit' or 'bitsandbytes.optim.PagedAdEMAMix8bit', or 'pytorch_optimizer.CAME'.",
 )
# backward compatibility
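
The help text above says any optimizer can be selected by giving the full dotted path to its class, e.g. 'pytorch_optimizer.CAME'. A minimal sketch of how such a path can be resolved at runtime (the helper name `get_optimizer_class` is hypothetical, not taken from this repository; the actual loader may differ):

```python
import importlib

def get_optimizer_class(full_path: str):
    # Split e.g. "pytorch_optimizer.CAME" into module path and class name,
    # import the module, then fetch the class attribute from it.
    module_name, class_name = full_path.rsplit(".", 1)
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# Stdlib demonstration of the same mechanism (the real use would pass
# something like 'bitsandbytes.optim.AdEMAMix8bit' or 'pytorch_optimizer.CAME'):
cls = get_optimizer_class("collections.OrderedDict")
```

The resolved class can then be instantiated with the trainable parameters and any optimizer keyword arguments.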


@@ -7,6 +7,7 @@ opencv-python==4.8.1.78
 einops==0.7.0
 pytorch-lightning==1.9.0
 bitsandbytes==0.44.0
+pytorch-optimizer==3.4.2
 prodigyopt==1.0
 lion-pytorch==0.0.6
 schedulefree==1.4