    Cheap options(?) to run local AI models

      I have been having fun learning about generative AI, all in the cloud so far -- I got some models from Hugging Face working, tried out Colab Pro, and found another cloud provider that runs SD models (dreamlook.ai, if anyone is interested).

      It's got me curious about trying to run something locally (mostly Stable Diffusion/DreamBooth, possibly Ollama).
      I currently have a ThinkPad T490 with 16 GB of RAM and the base-level graphics. I haven't actually tried to run anything locally, on the assumption that it would be extremely slow. I saw that you can get an external GPU, though I also saw some reports of headaches getting external GPUs up and running.

      I am curious what a workstation that could do a reasonable job running local models might cost. I am not a huge gamer and don't have any other high-performance needs that aren't already met by the ThinkPad; I'm not sure I can justify a $3000 workstation just to make a few JPEGs.

      I would be happy to buy something secondhand, e.g. if there's a good source of off-lease workstations.

      Alternatively -- if you have a computer similar to the T490 and do run models locally, what sort of performance is reasonable to expect? Would buying some more RAM for this laptop be enough?
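
      For anyone answering: here's the sort of throughput check I'd run once I try this -- a minimal sketch assuming Ollama is installed and serving on its default port (11434); the "llama3" model name is just a placeholder for whatever has actually been pulled. It times one completion and computes tokens per second from the metadata Ollama returns.

          import json
          import urllib.request

          # Minimal sketch: time one completion against a local Ollama server
          # (default http://localhost:11434). "llama3" is a placeholder; use
          # whatever model you've pulled with `ollama pull`.
          URL = "http://localhost:11434/api/generate"
          payload = json.dumps({
              "model": "llama3",
              "prompt": "Explain what an eGPU is in one paragraph.",
              "stream": False,  # return one JSON object instead of a stream
          }).encode("utf-8")

          req = urllib.request.Request(
              URL, data=payload, headers={"Content-Type": "application/json"}
          )
          with urllib.request.urlopen(req) as resp:
              result = json.load(resp)

          # Ollama reports eval_count (generated tokens) and eval_duration (ns).
          tps = result["eval_count"] / result["eval_duration"] * 1e9
          print(f"{result['eval_count']} tokens at {tps:.1f} tokens/sec")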

      Thanks for any advice!
