
Posted (edited)
  On 5/6/2025 at 6:40 PM, DlinnyLag said:

Next question to DeepSeek: "how to run you locally?" %)

I've already tried. The bare minimum is an RTX 3090, and that's with 4-bit quantization. The most budget-friendly option I found that runs without quantization is the Atlas 300V Pro: it's about 3x weaker than an A100 but roughly 10x cheaper, and it has 48 GB of VRAM on board, which lets you run a 33B model without terrible disappointment when comparing it to cloud V3. The 300V Pro delivers 256 TFLOPS (for comparison: A100 - 624 TFLOPS, RTX 3090 - 40 TFLOPS). But an RTX 3090 costs about the same as an Atlas 300V Pro. Naturally, it has no video output, so you'll have to keep two cards in your PC, and don't forget you'll also need around 128 GB of RAM. In short, local AI is no pleasure for the proletariat.
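To see why 4-bit quantization makes the difference between fitting and not fitting on a 24 GB card, here is a rough back-of-the-envelope sketch of the VRAM needed just for the model weights. It ignores KV cache and activation overhead (which add several more GB in practice), and the function name is mine, not from any library:

```python
def weight_vram_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Approximate VRAM for model weights only, in GB (1 GB = 1e9 bytes)."""
    # total bytes = parameter count * bits per weight / 8 bits per byte
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 33B model:
# 16-bit (unquantized FP16) -> ~66 GB: too big even for a 48 GB Atlas 300V Pro alone
#  4-bit (quantized)        -> ~16.5 GB: fits on a 24 GB RTX 3090
for bits in (16, 4):
    print(f"{bits}-bit: ~{weight_vram_gb(33, bits):.1f} GB")
```

The same arithmetic explains the ~128 GB RAM figure: loading and converting full-precision weights on the CPU side needs headroom well beyond the final GPU footprint.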

Edited by South8028