I’m now convinced that most holodeck malfunctions are the result of end users, who don’t know what they’re doing, using AI to generate poorly-written software they’re ill-equipped to debug or even really understand.

  • Flying Squid@lemmy.worldM · 5 months ago
    Really just one person, Geordi, through an accidental misphrasing of a request to the computer.

    I would never have used that computer again. Or at least given it a complete overhaul. A request no more elaborate than something you’d type into DALL-E shouldn’t be enough to make the computer create intelligent life.

    • sundray@lemmus.org · 5 months ago
      Yeah, they could at least stand to add a “This request will result in the creation of a sentient being. Are you sure you wish to proceed?” warning.

      • Flying Squid@lemmy.worldM · 5 months ago
        Really, a lot of Star Trek problems could be averted with an “Are you sure you want to do this?” prompt before someone does something with the computer or the holodeck. Starfleet apparently never learns. That’s why in Prodigy:

        spoiler

        Janeway goes back to the Delta Quadrant in a different ship, of a different but similar-looking class, renamed Voyager.

    • JWBananas@lemmy.world · 5 months ago
      Frankly, I would posit that present-day LLMs demonstrate exactly why Moriarty wasn’t even necessarily sapient.