Yes, let’s keep growing our group here! We’ve been getting new faces/bodies in the OC communities, and I’d like to hear from them here, too :-)

  • j4k3@lemmy.world · 9 months ago

    Thinking about and playing with ways of writing an LLM roleplaying story that prompts the user to choose between sending interstellar colony ships with 3k people each, or sending a colony ship with only 325 genetically selected humans on a 10-year contract to grow the population past 3k. The player is then forced to confront the social and moral implications as they discover you can't form relationships freely, but must coexist in close confines with an AGI telling you who you must partner with for 10 years. I don't know if I can shape it into something mostly philosophical and interesting with a bit of fun; right now it's mostly fun with the philosophy forgotten.

    • lazyneet@programming.dev · 9 months ago

      Interesting. Do you plan to ship the LLM with the game or dial up a remote server? Presumably you know how to make the model talk dirty?

      • j4k3@lemmy.world · 9 months ago

        Just doing stuff with offline open-source LLMs: Oobabooga Textgen WebUI, KoboldCpp, and Python scripts around a model loader and tokenizer, with models from huggingface.co. There are plenty of NSFW models, but you need to run the larger ones for complex fun. I use either a 70B or an 8×7B model, but you'll need enthusiast-level hardware for those, like a 12th-gen i7 or better and a 16GB+ GPU. You'll also need 64GB+ of system memory. There are smaller models that run on lower specs, but they tend to really struggle with complexity and highly constrained stories like this.
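        As a rough back-of-envelope for those hardware numbers (my own arithmetic, not from the thread): weight memory scales with parameter count times bits per weight, which is why a 70B model won't fit on a 16GB GPU and ends up partially offloaded into system RAM.

```python
def model_memory_gb(params_b: float, bits_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Rough weight-memory estimate: parameters * bytes per weight,
    plus ~20% for KV cache and runtime overhead (the overhead factor
    is an assumption, not a measured value)."""
    bytes_total = params_b * 1e9 * (bits_per_weight / 8)
    return bytes_total * overhead / 1e9

# A 70B model quantized to ~4 bits per weight needs roughly 42 GB --
# far beyond a 16GB card, hence offloading layers to 64GB+ system RAM.
print(round(model_memory_gb(70, 4)))  # ~42
# A 7B model at the same quantization fits easily, which is why the
# smaller models run on modest CPUs.
print(round(model_memory_gb(7, 4)))   # ~4
```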

        • lazyneet@programming.dev · 9 months ago

          Interesting. I managed to run a Llama model on the CPU of 3 different machines. It did OK, but with maybe 10% of the complexity of what you're talking about. I don't have enough faith in Moore's Law that we'll all have machines that can run your game, nor the internet bandwidth to download it, any time soon. Best bet would be physical media and dedicated hardware like a PS5, but then players would expect something more than text. If you wanted to share this with the world, I suppose you would host a web UI somewhere.

          • j4k3@lemmy.world · 9 months ago

            Here is an example of a context story that can be played out in oobabooga textgen: https://a.lemmy.world/lemmy.world/post/11356125

            You simply copy and paste this into the System Context, add your character description as mentioned in the post, and start chatting with Dors in the web interface. Oobabooga runs a locally hosted web server that your browser connects to; that chat dialog is the graphical interface.
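            Under the hood, that System Context is just text prepended to everything the model sees each turn. A minimal sketch of the idea (the function and field names are mine, not Oobabooga internals):

```python
def build_prompt(system_context: str, character_card: str,
                 history: list[tuple[str, str]], user_msg: str) -> str:
    """Assemble the full prompt string the model actually receives:
    system context, then character description, then the chat turns,
    ending with the character's name to cue its reply."""
    lines = [system_context.strip(), character_card.strip()]
    for speaker, text in history:
        lines.append(f"{speaker}: {text}")
    lines.append(f"You: {user_msg}")
    lines.append("Dors:")
    return "\n".join(lines)

# Hypothetical story text, just to show the assembly:
prompt = build_prompt(
    "You are Dors, a companion aboard a generation colony ship.",
    "Dors: calm, analytical, bound by the ship AGI's pairing rules.",
    [("You", "Why can't I choose my own partner?")],
    "Explain the contract terms.",
)
```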

            Overall, this exists in a grey area between a software developer tool and an end-user tool. It is not hard to use, but you install and launch it with a few commands in a terminal. That bit of complexity acts as a filter on users, which is a good thing too. Getting familiar with various models can be challenging, and there is a good chance my story would not work well with other large models. When I put together a story like this, it is written with the help of the model I am using: I ask it to write out the concepts by rephrasing things I have prompted.

            All that said, something like prompting a specific choice for the user to follow is tricky. I've only been able to do that with smaller models and by training a LoRA. It's no game yet, though; this is all just playing with model-loader context stuff. I can do a lot more hacking around with the Oobabooga code to make characters swap out in unique ways, with unique attributes loaded conditionally, but those are not things I can really share because they are not easily repeatable.
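            Conditionally swapping characters is, at bottom, just choosing which character card gets loaded into the context for the next turn. A toy sketch of that idea (the card text and state names are hypothetical, not the actual Oobabooga hooks):

```python
# Hypothetical character cards keyed by story state.
CHARACTERS = {
    "orientation": "AGI: formal, recites the 10-year contract terms.",
    "year_one":    "Dors: warmer, privately questions the pairing rules.",
}

def active_card(story_state: str) -> str:
    """Pick which character description is loaded into the system
    context this turn, based on where the story currently is.
    Unknown states fall back to the opening character."""
    return CHARACTERS.get(story_state, CHARACTERS["orientation"])
```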

            • lazyneet@programming.dev · 9 months ago

              So basically you write a long-ass prompt and feed it to an AI with a longer attention span than any human, and it plays DM. Cool. It makes me wonder if a similar model based on one-on-one conversation could provide the perspective of a specific character according to a similar prompt. I'm not big into AI at this point, but I'm sure it will replace human companionship eventually.