• Ulu-Mulu-no-die@lemmy.world
    1 year ago

    Leaking industry secrets is a much bigger concern than boosting productivity a little bit.

    We’re talking about very specialized engineering work; it’s not something you can fully rely on a bot to do, though it might help sometimes. It’s entirely understandable for specialized companies to want to ban GPT internally until there’s a way for them to host a fully internal one.

    • MentalEdge@sopuli.xyz
      1 year ago

      On this I agree entirely. The potential for corporate espionage because of unwitting employees using an LLM through unofficial means is huge.

      At the very least, the corporation itself would have to be the customer, not the individual employee, so that watertight terms could be negotiated.

      • Ulu-Mulu-no-die@lemmy.world
        1 year ago

        I don’t think being a customer would work either. These language models are still being trained, and no one knows exactly how user queries are used; that’s a big no-no for any company that has to protect its secrets.

        A self-hosted instance is a much better solution, if not the only “safe” one from that point of view. We’ll get there.