I’m interested in automatically generating lengthy, coherent stories of 100,000+ words from a single prompt using an open source, locally run large language model (LLM). I came across the “Awesome-Story-Generation” repository, which lists relevant papers describing promising methods like “Re3: Generating Longer Stories With Recursive Reprompting and Revision” (announced in a Twitter thread from October 2022) and “DOC: Improving Long Story Coherence With Detailed Outline Control” (announced in a Twitter thread from December 2022). However, these papers used GPT-3, and I was hoping to find similar techniques implemented with open source tools that I could run locally. If anyone has experience or knows of resources that could help me achieve long, coherent story generation with an open source LLM, I would greatly appreciate any advice or guidance.
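To make the question concrete, here is roughly the plan-then-draft loop I have in mind, sketched in Python against a local Ollama server. The endpoint, model tag, chapter count, and prompts are all placeholder assumptions on my part, and this only captures the outline-and-expand core of those papers; Re3’s rewrite and rerank modules and DOC’s controller are omitted.

```python
# Minimal outline-then-expand sketch in the spirit of Re3/DOC.
# Assumes a local Ollama server at http://localhost:11434 and a
# model tag such as "llama3" -- substitute whatever you run locally.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3"  # placeholder; any locally pulled model tag works


def generate(prompt: str) -> str:
    """Send one prompt to the local model and return its completion."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json()["response"]


def write_story(premise: str, n_chapters: int = 20) -> str:
    # Step 1 (plan): ask for a numbered chapter outline, loosely
    # following DOC's detailed-outline idea.
    outline = generate(
        f"Write a numbered outline of {n_chapters} chapters for a novel "
        f"with this premise:\n{premise}\nOne sentence per chapter."
    )
    # Naive parse: keep non-empty lines as outline items (a sketch;
    # real parsing would filter preamble and numbering more carefully).
    chapters = [line for line in outline.splitlines() if line.strip()]

    # Step 2 (draft): expand each outline item, passing a rolling
    # summary so later chapters stay consistent with earlier ones,
    # roughly as Re3 does with its structured prompt.
    story, summary = [], "The story has not started yet."
    for item in chapters[:n_chapters]:
        chapter = generate(
            f"Premise: {premise}\n"
            f"Summary of the story so far: {summary}\n"
            f"Outline item for this chapter: {item}\n"
            "Write this chapter in full prose (about 1,000 words)."
        )
        story.append(chapter)
        # Refresh the rolling summary so the context stays bounded.
        summary = generate(
            "Summarize this story so far in under 200 words:\n"
            f"{summary}\n{chapter}"
        )
    return "\n\n".join(story)


print(write_story("A lighthouse keeper discovers the sea is slowly receding."))
```

Something along these lines should run on consumer hardware, but I don’t know how well a small local model holds coherence over hundreds of such calls, which is exactly what I’m asking about.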

  • BlameThePeacock · 2 days ago

    You think humans aren’t pumping out 100% derivative works all the time?

    Like every shitty romance novel published. There’s only so many ways a man can woo a woman; they just change the location, randomize the set of actions from a list of things men can do to turn women on, throw in something to harm the relationship, and then come up with a set of names.

    • Deestan@lemmy.world · 1 day ago

      > You think humans aren’t pumping out 100% derivative works all the time?

      Don’t worry. I don’t think that.

      A big hope I have for AI is that 100% derivative work by humans is now easier to call out. If a rock with a 9V battery could produce it, why should we value it?

      • BlameThePeacock · 17 hours ago

        We value tons of things produced by rocks we run electricity through. Why is this any different from a car built by a robot welder, or a house constructed with a crane?