this post was submitted on 04 Oct 2024

StableDiffusion

This is an automated archive made by the Lemmit Bot.

The original was posted on /r/stablediffusion by /u/isnaiter on 2024-10-03 20:14:51+00:00.


Going straight to the point: I fixed Prodigy's main issue. With my fix, you can train the UNet and text encoders (TEs) for as long as you want without frying the TEs or undertraining the UNet. To use it, grab the code I submitted in a PR on Prodigy's GitHub. I don't know whether they'll accept it, so you'll probably have to replace the file manually in your venv.

Edit: it's also possible to set a different LR for each network.
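As a sketch of what per-network LRs could look like: the names and values below are illustrative, and the commented-out constructor assumes the patched prodigy.py accepts PyTorch-style parameter groups the way other torch optimizers do.

```python
# Illustrative sketch: per-network learning rates via PyTorch-style parameter
# groups. unet_params and te_params are placeholders for the real
# nn.Parameter lists; the Prodigy constructor is commented out so this runs
# without torch installed.
unet_params = []  # e.g. list(unet.parameters())
te_params = []    # e.g. list(text_encoder.parameters())

param_groups = [
    {"params": unet_params, "lr": 1.0},  # Prodigy's lr acts as a multiplier; 1.0 is the usual default
    {"params": te_params, "lr": 0.5},    # e.g. train the text encoders more gently
]

# from prodigyopt import Prodigy
# optimizer = Prodigy(param_groups, weight_decay=0.01)
```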

About the loss modifier: I made it based on my limited knowledge of diffusion training and machine learning. It's not perfect and it's not the holy grail, but my trainings consistently turn out better when I use it.

Feel free to suggest ways to improve it.

For convenience, I replaced OneTrainer's min-SNR-gamma function with my own, so all I need to do is enable min-SNR-gamma (MSG) and my function takes over.
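For reference, the standard min-SNR-gamma weighting that this hook point normally computes (from the Min-SNR paper, for epsilon-prediction; this is the stock behavior being replaced, not my custom modifier) is roughly:

```python
def min_snr_gamma_weight(snr: float, gamma: float = 5.0) -> float:
    """Standard Min-SNR loss weight for epsilon-prediction:
    clamp the timestep's SNR at gamma, then normalize by the SNR."""
    return min(snr, gamma) / snr

# High-SNR (low-noise) timesteps get down-weighted;
# low-SNR (high-noise) timesteps keep a weight of 1.
```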

I'm not going to post any examples here, but if anyone's curious, I uploaded a training I did of my ugly face to the training-results channel on the OT Discord.

Edit:

To use the prodigy fix, get the prodigy.py here:

and put it in this folder:

C:\your-trainer-folder\OneTrainer\venv\Lib\site-packages\prodigyopt\

That's it, all the settings in OT stay the same, unless you want to set different LRs for each network, which is now possible.
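If you'd rather do the file swap from a script, here's a stdlib-only sketch; the source and destination paths are illustrative, so adjust them to where you saved the file and where your venv actually lives.

```python
# Sketch: copy the patched prodigy.py into OneTrainer's venv, overwriting the
# stock file. Paths are placeholders; the guard keeps this safe to run as-is.
import shutil
from pathlib import Path

src = Path.home() / "Downloads" / "prodigy.py"
dest = Path(r"C:\your-trainer-folder\OneTrainer\venv\Lib\site-packages\prodigyopt")

if src.is_file() and dest.is_dir():
    shutil.copy2(src, dest / "prodigy.py")  # replaces the stock file
else:
    print("Adjust src/dest to your actual paths before running.")
```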

To use my custom loss modifier, get the ModelSetupDiffusionLossMixin.py here:

and put it in this folder:

C:\your-trainer-folder\OneTrainer\modules\modelSetup\mixin

Then, in OT's UI, select MIN_SNR_GAMMA as the Loss Weight Function on the training tab and enter any positive value.

The value itself doesn't matter; it just makes OT trigger the conditionals for the min-SNR-gamma function, which now runs my function in its place.
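In other words (illustrative only, not OneTrainer's actual code), the described behavior boils down to a simple gate: any positive gamma value activates the loss-weight code path.

```python
# Hypothetical sketch of the gating behavior described above: the gamma value
# is only checked for being positive, not used for its magnitude.
def loss_weight_active(min_snr_gamma: float) -> bool:
    return min_snr_gamma > 0
```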

There was a typo in the function name in the loss modifier file; it was missing an underscore. I've fixed it now.
