Question:

To try PyTorch 2.0 with SDP attention, I set the following in webui-user.bat:

set COMMANDLINE_ARGS=--autolaunch --theme dark --opt-sdp-attention
set TORCH_COMMAND=pip install torch torchvision --extra-index-url

But when I launched the UI with this script, nothing happened (no install of PyTorch 2.0). So I entered the following command manually in my command line to force the installation:

pip install torch torchvision --extra-index-url

It seemed to install properly, but when I then run the UI, I get only around 3.1 it/s with my RTX 2060. If I instead use xformers, with these commandline args, I get around 4.1 it/s:

set COMMANDLINE_ARGS=--xformers --opt-channelslast --autolaunch --theme dark --always-batch-cond-uncond --medvram --disable-safe-unpickle

So it seems xformers is still much better for me. Do you think I did something wrong? Are there some instructions I can type to check whether everything is set up properly? Thank you!

Reply:

First off: although there are no pre-compiled xformers builds for torch 2.0 yet, there is no real need for them anymore.

pip install torch torchaudio torchvision triton --force-reinstall --extra-index-url

For launch flags, you can use either --opt-sdp-attention (the usual one) or --opt-sdp-no-mem-attention (which disables one part of SDP and makes it deterministic).

That is only needed when using --upgrade instead of --force-reinstall (force means forcing a reinstall of the dependencies as well). Yes, torch 2.0 requires numpy 1.24 instead of the usual 1.23, but that's handled internally.

Re: CUDA_PATH - just IMO, this can lead to self-inflicted issues, as (a) you may forget to set it sometimes, and (b) not all libs interpret that env variable consistently. It's best to modify /etc/ld.so.conf.d/* and rerun ldconfig so it creates unified system bindings.
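To answer the "instructions I can type to check if everything is set up properly" part: a minimal sanity-check sketch is below. The script name and helper functions are my own, not from the thread; run it inside the webui's Python environment. It reports the installed torch version, whether it is new enough to ship scaled_dot_product_attention (torch >= 2.0), whether CUDA is visible, and whether xformers is importable.

```python
# check_setup.py - sanity checks for a torch 2.0 + SDP attention setup (sketch).
# Run inside the webui's Python environment, e.g.:  python check_setup.py
import importlib.util


def version_tuple(v):
    """Parse a version string like '2.0.1+cu118' into (2, 0, 1).

    The local build suffix after '+' and non-numeric parts (e.g. 'dev')
    are ignored, so this stays a rough comparison, not a full PEP 440 parse.
    """
    core = v.split("+")[0]
    return tuple(int(p) for p in core.split(".") if p.isdigit())


def supports_sdp(torch_version):
    """torch.nn.functional.scaled_dot_product_attention ships with torch >= 2.0."""
    return version_tuple(torch_version) >= (2, 0)


if __name__ == "__main__":
    if importlib.util.find_spec("torch") is None:
        print("torch is not installed in this environment")
    else:
        import torch

        print("torch:", torch.__version__,
              "| SDP attention available:", supports_sdp(torch.__version__))
        print("CUDA available:", torch.cuda.is_available())
        if importlib.util.find_spec("xformers") is None:
            print("xformers: not installed")
        else:
            import xformers

            print("xformers:", xformers.__version__)
```

If this prints a 2.x torch version, CUDA available, and SDP attention available, the --opt-sdp-attention path should work; if torch still shows 1.x, the TORCH_COMMAND install did not take effect.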