Selected Personality Model is not loaded; the Gemma base is loaded instead #3
Hi, I set the MODEL_NAME env var in my .env file to match the name of the personality model I want to use, so it looks like this (it matches the name I see in the Open WebUI playground):
MODEL_NAME=nikolas-tesla
I've tried it with and without single and double quotes. With quotes I get "model not found"; without quotes (MODEL_NAME=nikolas-tesla) it runs, but the responses are very clearly the Gemma baseline and the nikolas-tesla personality model has not been loaded.
I'm sure it's something silly I'm doing wrong, but I could use some guidance. Thanks
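For reference, these are the variants I tried in .env (nikolas-tesla is simply the model name as it appears in my Open WebUI instance):

```
MODEL_NAME=nikolas-tesla
MODEL_NAME='nikolas-tesla'
MODEL_NAME="nikolas-tesla"
```

The quoted forms give "model not found"; the unquoted form runs but answers as plain Gemma.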
Are you using Gemma as the base model? Chances are that it is not working the same as the Llama3 model it was written for.
I would recommend trying a different base model, or tweaking the custom model. How does the model respond from within Open WebUI?
I am using Gemma as the base model. In Open WebUI it responds correctly, as I have nikolas-tesla set as the default personality model.
Is the personality set via the System Prompt or through the use of tools/functions/KBs?
Have you tried to perform requests against OWUI API endpoints directly using curl?
If the bot is responding in Discord, then that part of the code is working correctly, and the issue stems from how the model responds over the API.
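Something along these lines would be a quick check (a rough sketch only; the URL, API key, and model name are placeholders for your setup, and it assumes the OpenAI-compatible endpoints Open WebUI exposes):

```bash
# List the models the server knows about, to confirm the exact model name
curl -s http://localhost:3000/api/models \
  -H "Authorization: Bearer $OWUI_API_KEY"

# Ask the personality model a question directly, bypassing the Discord bot
curl -s http://localhost:3000/api/chat/completions \
  -H "Authorization: Bearer $OWUI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "nikolas-tesla",
    "messages": [{"role": "user", "content": "Who are you?"}]
  }'
```

If the reply here already sounds like plain Gemma, the problem is on the Open WebUI/model side rather than in the bot.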
It's just a prompt, I believe; I imported it, but that's all it seems to be. I have not tried making API calls directly with curl.
I agree. When I circle back to this (when the ADHD train comes around), I'll see if I can figure out how to make those API calls with curl :)
Sounds good, I'll close this issue.