• circuscritic · 1 month ago

    Llama is the model I use most often, followed by ChatGPT and Claude.

    Others as well, but yes, they are incredibly helpful for the tasks I use them for.

      • circuscritic · 1 month ago

        Yes and no: I have self-hosted models on one of my Linux boxes, but even with a relatively modern 70-series Nvidia GPU, it’s still faster to use free non-local services like ChatGPT or DDG.
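
        For reference, a minimal sketch of querying a self-hosted model from Python, assuming an Ollama server running locally with a Llama model already pulled (the commenter doesn’t say which runtime or model they actually use):

        ```python
        import requests

        # Assumes an Ollama server is listening on its default port and a Llama
        # model has already been pulled (e.g. `ollama pull llama3`); illustrative only.
        OLLAMA_URL = "http://localhost:11434/api/generate"

        def ask_local_model(prompt: str, model: str = "llama3") -> str:
            """Send a single prompt to the local model and return its reply."""
            resp = requests.post(
                OLLAMA_URL,
                json={"model": model, "prompt": prompt, "stream": False},
                timeout=300,  # local inference on consumer GPUs can be slow
            )
            resp.raise_for_status()
            return resp.json()["response"]

        if __name__ == "__main__":
            print(ask_local_model("Summarize the trade-offs of self-hosting an LLM."))
        ```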

        My rule of thumb for SaaS LLMs is to never enter any data that I wouldn’t also be willing to upload in cleartext to Google Drive or OneDrive.

        Sometimes that means modifying text before submitting it; other times it means relying entirely on self-hosted tools.