• @modeler@lemmy.world · 41 points · 4 months ago (edited)

    Typically you need about 1GB graphics RAM for each billion parameters (i.e. one byte per parameter). This is a 405B parameter model. Ouch.

    Edit: you can try quantizing it. This reduces the amount of memory required per parameter to 4 bits, 2 bits or even 1 bit. As you reduce the size, the performance of the model can suffer. So in the extreme case you might be able to run this in under 64GB of graphics RAM.
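    The rule of thumb above can be sketched in a few lines. This is a minimal estimate of weight memory only — real usage also needs headroom for the KV cache and activations, so treat the numbers as a floor, not a target:

    ```python
    # Rough VRAM needed just to hold the weights of an N-billion-parameter
    # model at various quantization levels. Ignores KV cache, activations,
    # and framework overhead, which all add real memory on top of this.

    def weight_memory_gb(params_billions: float, bits_per_param: int) -> float:
        bytes_total = params_billions * 1e9 * bits_per_param / 8
        return bytes_total / 1e9  # decimal GB

    for bits in (16, 8, 4, 2, 1):
        print(f"405B @ {bits}-bit: {weight_memory_gb(405, bits):.0f} GB")
    ```

    At 1 bit per parameter the 405B weights come to roughly 51 GB, which is where the "under 64GB in the extreme case" figure comes from.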

    • @cheddar@programming.dev · 21 points · 4 months ago

      Typically you need about 1GB graphics RAM for each billion parameters (i.e. one byte per parameter). This is a 405B parameter model.

        • bruhduh · 1 point · 4 months ago

          https://www.ebay.com/p/116332559 — LGA2011 motherboards are quite cheap. Drop in two Xeon E5-2696 v4 CPUs (44 threads each, 88 total) and 8× 32GB DDR4 sticks; it comes out quite cheap, actually. You can also install Nvidia P40s with 24GB each. You can max out this build for AI for under $2000.

      • chiisana · 2 points · 4 months ago

        Finally! My dumb dumb 1TB RAM server (4× E5-4640 + 32× 32GB DDR3 ECC) can shine.

    • @Siegfried@lemmy.world · 8 points · 4 months ago (edited)

      At work we have a small cluster totalling around 4TB of RAM.

      It has 4 cooling units, a cubic metre of PSUs, and it must take up something like 30 m² of floor space.

    • TipRing · 4 points · 4 months ago

      When the 8-bit quants hit, you could probably lease a 128GB system on RunPod.

    • @1984@lemmy.today · 3 points · 4 months ago

      Can you run this in a distributed manner, like with Kubernetes and lots of smaller machines?

    • @obbeel@lemmy.eco.br · 2 points · 4 months ago

      According to Hugging Face, you can run a 34B model using at most 22.4GB of RAM. That fits on an RTX 3090 Ti.

    • @Longpork3@lemmy.nz · 1 point · 4 months ago (edited)

      Hmm, I probably have that much distributed across my network… maybe I should look into some way of distributing it across multiple GPUs.

      Frak, just counted and I only have 270GB installed. Approx 40GB more if I install some of the deprecated cards in any spare PCIe slots I can find.

    • arefx · 1 point · 4 months ago

      You mean my 4090 isn’t good enough 🤣😂