A number of optimizations can be enabled by commandline arguments:

| commandline argument | explanation |
| --- | --- |
| `--opt-sdp-attention` | May result in faster speeds than using xFormers on some systems but requires more VRAM. (non-deterministic) |
| `--opt-sdp-no-mem-attention` | (deterministic, slightly slower than `--opt-sdp-attention` and uses more VRAM) |
| `--xformers` | Great improvement to memory consumption and speed. (non-deterministic) |
| `--force-enable-xformers` | Enables xFormers regardless of whether the program thinks you can run it or not. Do not report bugs you get running this. |
| `--opt-split-attention` | Cross attention layer optimization significantly reducing memory use for almost no cost (some report improved performance with it). On by default for torch.cuda, which includes both NVidia and AMD cards. |
| `--opt-split-attention-v1` | Uses an older version of the optimization above that is not as memory hungry (it will use less VRAM, but will be more limiting in the maximum size of pictures you can make). |
| `--opt-sub-quad-attention` | Sub-quadratic attention, a memory-efficient Cross Attention layer optimization that can significantly reduce required memory, sometimes at a slight performance cost. Recommended if getting poor performance or failed generations with a hardware/software configuration that xFormers doesn't work for. On macOS, this will also allow for generation of larger images. |
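As a minimal sketch of how one of these flags is usually passed, assuming the standard `webui-user` launch scripts from the usual stable-diffusion-webui layout (the script names here are the conventional ones, not stated in this article):

```shell
# Pick exactly ONE attention optimization flag for your hardware:
COMMANDLINE_ARGS="--opt-sub-quad-attention"    # fallback when xFormers doesn't work
# COMMANDLINE_ARGS="--xformers"                # big memory/speed improvement (non-deterministic)
# COMMANDLINE_ARGS="--opt-sdp-attention"       # may beat xFormers on some systems, needs more VRAM
export COMMANDLINE_ARGS

# Then launch as usual (commands shown, not run here):
#   ./webui.sh        # Linux/macOS
#   webui-user.bat    # Windows: set COMMANDLINE_ARGS inside the .bat instead
```

Only one of the mutually exclusive attention optimizations should be active at a time; combining them is not meaningful since each replaces the same Cross Attention layer.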