• Lucy :3@feddit.org

    If you have a decent GPU or CPU, you can just set up ollama with ollama-cuda/ollama-rocm and run llama3.1 or llama3.1-uncensored.
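A minimal sketch of that setup, assuming an Arch-style system where `ollama-cuda` (NVIDIA) and `ollama-rocm` (AMD) are the GPU-enabled packages; the install/run commands are left commented out since they need root access and a large model download:

```shell
# Pick a model tag; "llama3.1-uncensored" is a community variant and may
# not exist under that exact name in the default registry (assumption).
MODEL="llama3.1"

# sudo pacman -S ollama-cuda   # NVIDIA GPUs; use ollama-rocm for AMD
# ollama serve &               # start the local server in the background
# ollama pull "$MODEL"         # download the model weights
# ollama run "$MODEL"          # open an interactive chat session

echo "would run: ollama run $MODEL"
```

On CPU-only machines the plain `ollama` package works too, just slower for larger models.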

    • 1985MustangCobra

      I have a Ryzen 5 laptop. Not really decent enough for that workload, and I'm not crazy about AI.

        • 1985MustangCobra

          Interesting. Well, it's something to look into, but I'd like a place to communicate with like-minded people.