
Eadoo's Text Generation Web UI characters



I'm currently experimenting with AI for roleplay chat. As I do, I'm developing characters I'm satisfied enough with to share.

If you'd like to join me, I strongly recommend this video for setup instructions.

 

If you need model suggestions, I used to use these on my 1650 GPU:

  • Xwin-MLewd-7B-V0.2-GPTQ - I used this one as my general-purpose model.
  • Dans-AdventurousWinds-Mk2-7B-GPTQ - a choose-your-own-adventure-style model. I like this one for Lewd Dungeon.
  • Tiamat-7B-GPTQ - a very "to-the-point, to hell with your feelings" conversational model. I use this one for brutal conversations with nothing held back.
  • dolphin-2.2.1-mistral-7B-GPTQ - an emotionally sensitive model. I like to use this one with Sarah; I find it leads to emotionally rich conversations.

 

I use these primarily on my 3060 GPU:

  • Chronomaid-Storytelling-13B-GPTQ - my preferred general-purpose model.
  • MythoMax-L2-13B-GPTQ - a good fall-back if I want something different.
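If you're wondering which size fits your card, a back-of-envelope estimate of the quantized weight size explains the split above. This is a rough sketch with assumed figures - real usage adds a GB or two on top for KV cache and activations:

```python
# Rough size of a quantized model's weights alone, in decimal GB.
# 4-bit is typical for GPTQ/AWQ quantizations like the ones above.

def weight_gb(n_params: float, bits: int = 4) -> float:
    return n_params * bits / 8 / 1e9

for name, params in [("7B", 7e9), ("13B", 13e9), ("30B", 30e9)]:
    print(f"{name}: ~{weight_gb(params):.1f} GB of weights at 4-bit")
```

By this estimate a 7B model is ~3.5 GB of weights, which is why it squeaks onto a 4 GB 1650, while a 13B at ~6.5 GB really wants something like the 3060's 12 GB.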

 

Current characters:

  • Minotaur - captured and raped
  • Rusdall - drugged, and then dragged off to be raped
  • Lewd Dungeon - NSFW dungeon adventure storyteller
  • Goblins - raped, killed, and eaten
  • Suzanne - a young woman goes to her mom for advice and learns how perverted she is
  • Margot - your sister has a serious problem and needs your help
  • Bandits - captured, raped, and sold into slavery
  • Sarah - your wife has had it with your shit
  • Prince Hernan - you are sold to a noble for his use
  • Sylvia - for all you BBC cuckolds out there
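If you'd rather roll your own characters, the web UI loads them from simple YAML files in its characters/ folder, with at least name, greeting, and context fields (mirroring the bundled Example character). Here's a sketch that builds one - the character itself is invented for illustration:

```python
# Minimal character card for text-generation-webui, built as a string.
# Save the output as characters/<name>.yaml and it shows up in the
# UI's character list. "Tavern Keeper" is a made-up example.

fields = {
    "name": "Tavern Keeper",                     # display name in the UI
    "greeting": "Welcome, traveler! Sit, sit.",  # the character's opening message
    "context": "Tavern Keeper is a gruff but warm-hearted innkeeper who "
               "gossips about every adventurer passing through.",
}

card = "\n".join(f"{key}: {value}" for key, value in fields.items())
print(card)
```

(For anything with colons or multi-line backstory, use proper YAML quoting/block scalars rather than this naive join.)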

  • Submitted
    12/06/2023
  • Requires
    Text generation web UI

 

8 hours ago, Durin said:

This is intriguing, thanks for sharing! Can you recommend a tutorial for getting started with this sort of thing?

 

The video I shared in my original post ("INSTALL UNCENSORED TextGen Ai WebUI LOCALLY in 1 CLICK!") is what got me started. Keep in mind that I only found this and started playing with it a week ago, so there might be other resources that more experienced folks would suggest. Despite that, I still go back to that video (this morning I decided to look at GGUF-formatted models, and there's a section there that helped me with that).

Be aware that older PCs won't handle this very well - you really need a moderately beefy GPU (I'm currently on a 1650, which works OK but could be better). I've been planning for a while to upgrade one of my systems to make the hop from SD to SDXL-based models for AI art anyway, and this has convinced me that the time to upgrade is now, so I can do both that and run larger LLMs on this tool.

This is probably gobbledygook for someone new, though, so here's the progression I followed, which you might follow along with:

 

Follow along with the video and do what he suggests. Download the model he recommends, then explore the other models available from the user he highlights on huggingface (TheBloke). Stick with the 7B size to start, and run it fully on your GPU by sticking to the GPTQ and AWQ versions. Once you're comfortable with that, move up to the 13B models and see if they fit within your GPU's VRAM. If you have a monster GPU, move up to 30B.

If you're like me and have more modest hardware, look into GGUF-format models and how to split the load between system RAM/CPU and VRAM/GPU.
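To pick that split, you can estimate how many transformer layers fit in your free VRAM and hand that number to the loader's n-gpu-layers setting. A sketch - the per-layer size below is an assumed ballpark for a 4-bit 13B GGUF, not a measured value:

```python
# How many of a model's layers fit on the GPU; the rest stay in
# system RAM and run on the CPU. Illustrative numbers, not measurements.

def gpu_layers(n_layers: int, layer_gb: float, free_vram_gb: float) -> int:
    return min(n_layers, int(free_vram_gb // layer_gb))

# e.g. ~40 layers at ~0.17 GB each, with ~3 GB of VRAM free on a 1650:
print(gpu_layers(40, 0.17, 3.0))  # use this value for n-gpu-layers
```

If generation stutters or you hit out-of-memory errors, back the number off a few layers; if VRAM has headroom, push it up.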

 

Oh, and somewhere in there, create some characters or use mine.  :)

Edited by Eadoo
add link
