this post was submitted on 05 Sep 2024
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/stablediffusion by /u/Historical-Action-13 on 2024-09-04 21:40:08+00:00.


Early on, model checkpoints (.ckpt files) were identified as a potential attack vector via "pickling": Python's pickle serialization can execute arbitrary code when a file is loaded.

This was quickly mitigated by moving to the safetensors format, and everyone now understands and abides by this best practice.
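For anyone who hasn't seen why pickles are dangerous, here's a minimal, harmless demonstration (standard library only) of how a pickle-based checkpoint can execute code the moment it's loaded:

```python
import pickle

# Any object can define __reduce__ to make unpickling call an arbitrary
# function. A real attack would call os.system(...) or similar; this
# demo just prints a message to show that code ran.
class Payload:
    def __reduce__(self):
        return (print, ("arbitrary code ran during unpickling",))

blob = pickle.dumps(Payload())   # what a malicious .ckpt effectively contains
pickle.loads(blob)               # "loading the model" executes the payload
```

Safetensors files, by contrast, are raw tensor data plus a JSON header, so loading one can't run code.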

There's another elephant in the room: we are downloading all kinds of new repos from unknown sources, and they frequently fetch dependencies during install. Some also require internet access every time they run, either for a cloud service the author implemented with little thought, or to check for updates or download dependencies.

As an example, Auto1111 doesn't require internet after install, but it does connect every time you use a ControlNet or upscaler model you haven't used before.

That isn't suspicious and I trust Auto, but it becomes a security risk when untrusted add-ons are potentially running alongside it in the same environment.

It should be standard practice for all repos to have an easy option to download all possible dependencies during the initial install so that they can then be walled off.

If developers want to implement cloud-based tools with their repo, that's fine, but they should obtain user consent before doing so, and a missing connection shouldn't cause a panic and break the software.

Furthermore, if for some reason a dependency wasn't downloaded and is now needed, the software should not panic and break because it couldn't be fetched. Rather, it should give the user clear instructions on which model or dependency is needed and which folder to place the file in so it can be loaded locally.
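In code terms, that behavior is as simple as something like this (the helper name and folder path are just illustrative, not from any particular repo):

```python
from pathlib import Path

# Hypothetical helper: resolve a required model file locally instead of
# silently downloading it or crashing with a stack trace.
def require_model(filename: str, models_dir: str = "models/upscalers") -> Path:
    path = Path(models_dir) / filename
    if path.exists():
        return path
    raise FileNotFoundError(
        f"Missing dependency: {filename}\n"
        f"Download it manually and place it in '{models_dir}', "
        f"then restart. No network access is required."
    )
```

A failed lookup tells the user exactly what to fetch and where to put it, instead of the tool trying to phone home.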

Basically, just as we all expect safetensors and treat pickles with caution, we should expect offline installers to be available for every repo, and treat the ones that don't offer them with caution.

Is this a big deal right now? Not really, and I personally am not that concerned about it yet.

I assume all the big-name devs are honest and have no bad intentions; I'm more concerned about nodes, extensions, and standalone new repos.

As for motive, you have artists who hate AI and might maliciously bundle malware with new repos. You also have the potential for credential-stealing malware, or any number of other threats that come with downloading new experimental code or software.

Open source isn't a protection if nobody has had time to proofread the code yet.

We should be able to run these repos with a complete offline installer and be able to block their network access after without breaking them.
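One rough way to check that a repo actually survives offline is to disable outbound connections inside the Python process before importing it. This only catches connections made through Python's socket module; real walling-off belongs at the firewall or sandbox level:

```python
import socket

# Replace socket.socket with a subclass whose connect() always fails,
# so any later attempt to phone home errors out loudly instead of
# silently succeeding.
class _NoNetwork(socket.socket):
    def connect(self, address):
        raise OSError(f"blocked outbound connection to {address}")

socket.socket = _NoNetwork
# ...now import and exercise the tool; any connect() attempt raises.
```

If the tool crashes under this, it has a hard network dependency that an offline installer should have eliminated.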

I want to emphasize that I'm not currently accusing any devs of embedding malware; I just want us to get ahead of the potential threat the same way we did with pickle files.
