• Captain Janeway@lemmy.world · 5 months ago

    I think it makes sense. I like ChatGPT and I appreciate having easy access to it. What I really wish for is the option to use local models instead. I realize most people don’t have machines that can run local models quickly enough, but for those who do…

      • ahal · 5 months ago

        From the post:

        Whether it’s a local or a cloud-based model, if you want to use AI, we think you should have the freedom to use (or not use) the tools that best suit your needs.

    • Blisterexe@lemmy.zip · 5 months ago

      From the post, it seems like they’ll add support for self-hosted models before the feature leaves beta.
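
For anyone wondering what “using a local or self-hosted model” looks like in practice, here is a minimal sketch. It assumes an Ollama server running locally on its default port (11434) with an OpenAI-compatible chat endpoint and a model such as llama3 already pulled; the server, port, endpoint path, and model name are assumptions for illustration, not anything the post commits to.

    # Minimal sketch: chat with a locally hosted model instead of a cloud service.
    # Assumes an Ollama server at localhost:11434 exposing an OpenAI-compatible
    # /v1/chat/completions endpoint and a model named "llama3" already pulled.
    import json
    import urllib.request

    def chat_local(prompt: str, model: str = "llama3") -> str:
        """Send one chat message to the local model server and return its reply."""
        payload = json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/v1/chat/completions",  # assumed local endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.loads(resp.read())
        # OpenAI-compatible responses put the reply text here.
        return body["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(chat_local("Summarize this page in one sentence."))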