• panda_abyss · 19 points · 2 days ago

    I plugged my local AI into offline Wikipedia, expecting a source of truth to make it way, way better.

    It’s better, but now I also can’t tell when it’s making up citations, because it uses Wikipedia to support its own worldview from pre-training instead of reality.

    So it’s not really much better.

    Hallucinations become a bigger problem the more info the model has, because now you have to double-check all of it.
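
    For context, the setup is roughly the sketch below (a minimal illustration, not my exact code): retrieve passages from a local Wikipedia index and force the model to ground its answer in them. `search_wikipedia` and `ask_llm` are hypothetical stand-ins for whatever local index and model runtime you use.

    ```python
    # Minimal RAG loop over an offline Wikipedia dump. The two helpers are
    # hypothetical stand-ins; swap in your own local index and model runtime.

    def search_wikipedia(query: str, k: int = 3) -> list[str]:
        # Stand-in: a real version would query a local index over the dump
        # (e.g. a ZIM file) and return the top-k passages.
        return [f"<passage {i} about {query!r}>" for i in range(1, k + 1)]

    def ask_llm(prompt: str) -> str:
        # Stand-in for the local model call.
        return "<model answer>"

    def answer(question: str) -> str:
        context = "\n\n".join(search_wikipedia(question))
        prompt = (
            "Answer using ONLY the passages below, and quote the passage "
            "that supports each claim. If the passages don't cover the "
            "question, say so instead of guessing.\n\n"
            f"{context}\n\nQuestion: {question}"
        )
        return ask_llm(prompt)
    ```

    Even with the grounding instruction, nothing stops the model from paraphrasing a passage in a way that matches its priors rather than the source, which is the failure mode above.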

    • FlashMobOfOne@lemmy.world · 7 up, 1 down · 2 days ago

      At my work, we don’t allow it to generate citations. We instruct it to insert placeholders for citations instead, which lets us hunt down the information, verify it, and add the citations ourselves.
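
      As a rough sketch of that policy (the prompt wording and placeholder format here are made up for illustration): tell the model to emit a marker instead of a reference, then scan drafts for the markers so a reviewer can fill each one in by hand.

      ```python
      import re

      # Hypothetical prompt enforcing the placeholder policy described above.
      SYSTEM_PROMPT = (
          "Never generate citations or references. Wherever a claim needs a "
          "source, insert [CITATION: short description of the claim] instead."
      )

      PLACEHOLDER = re.compile(r"\[CITATION:\s*([^\]]+)\]")

      def citation_gaps(draft: str) -> list[str]:
          """List every placeholder so a reviewer can hunt down the real
          source, verify it, and substitute it in."""
          return PLACEHOLDER.findall(draft)

      draft = "Feature X reduces tail latency [CITATION: latency claim]."
      print(citation_gaps(draft))  # ['latency claim']
      ```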

        • FlashMobOfOne@lemmy.world · 5 points · 2 days ago

          Yup.

          In some instances that’s sufficient, though, depending on how much precision your work requires. Either way, you have to review whatever it produces.

      • panda_abyss · 2 up, 1 down · 2 days ago

        That probably makes sense.

        I haven’t played around with it since the initial shell shock of “oh god, it’s worse now.”