load models to VRAM when using `--lowram` param
load models to VRAM instead of RAM (for machines that have more VRAM than RAM, such as the free Google Colab server)
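
A minimal sketch of how a flag like this might be wired up, assuming a PyTorch-style checkpoint loader; the `load_checkpoint` function and the argparse setup below are illustrative, not the repository's actual code:

```python
import argparse

import torch

parser = argparse.ArgumentParser()
# Hypothetical flag mirroring the --lowram option described above.
parser.add_argument(
    "--lowram",
    action="store_true",
    help="load checkpoint weights directly to VRAM instead of RAM",
)
args = parser.parse_args()


def load_checkpoint(path: str):
    """Load a model checkpoint, targeting VRAM when --lowram is set.

    With --lowram, torch.load maps tensors straight onto the GPU so the
    weights never have to fit in system RAM (useful when the machine has
    more VRAM than RAM, e.g. the free Google Colab tier).
    """
    map_location = "cuda" if args.lowram else "cpu"
    return torch.load(path, map_location=map_location)
```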