Running Eliza with LLAMALOCAL fails after first query #1575

Closed
hiteshjoshi1 opened this issue Dec 30, 2024 · 4 comments
Labels: bug Something isn't working

Describe the bug

The client never gets the model's response, and the server keeps repeating
// End of conversation

To Reproduce

  • Using the latest release branch
  • I am not trying to do anything fancy yet, just running Eliza out of the box with the llama_local provider (see the character sketch after this list)
  • When I run the client and send a query, the server receives it and prints a response in the console, but that response never reaches the UI running at 5173
  • The server then goes into a loop, repeating
  • // End of conversation
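
For reference, a minimal sketch in TypeScript of what "out of the box" means here: the stock character pointed at the local llama provider. The modelProvider value is taken from the log further down; the import path, the ModelProviderName member, and the defaultCharacter export are assumptions about the repo's API, not verified here.

// Minimal sketch only: point the stock character at the local llama provider.
// ModelProviderName.LLAMALOCAL and defaultCharacter are assumed names; check
// the actual exports in the repo before copying this.
import { Character, ModelProviderName, defaultCharacter } from "@ai16z/eliza";

export const character: Character = {
  ...defaultCharacter,                          // start from the stock Eliza character
  modelProvider: ModelProviderName.LLAMALOCAL,  // route text generation to the local GGUF model
};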

Expected behavior

The client should receive the response, and the server should not go into a loop printing the same thing again and again.

Screenshots

Additional context

LOG

◎ LOGS
Creating Memory
31deda32-5963-083c-8283-189a6f6c3616
Yo Eliza , need some investment advice in this market

["◎ Generating message response.."]

["◎ Generating text..."]

ℹ INFORMATIONS
Generating text with options:
{"modelProvider":"llama_local","model":"large"}

ℹ INFORMATIONS
Selected model:
NousResearch/Hermes-3-Llama-3.1-8B-GGUF/resolve/main/Hermes-3-Llama-3.1-8B.Q8_0.gguf?download=true

["ℹ Model not initialized, starting initialization..."]

["ℹ Checking model file..."]

["⚠ Model already exists."]

["⚠ LlamaService: No CUDA detected - local response will be slow"]

["ℹ Initializing Llama instance..."]

["ℹ Creating JSON schema grammar..."]

["ℹ Loading model..."]

["ℹ Creating context and sequence..."]

["✓ Model initialization complete"]

Response

{ "user": "Eliza", "text": "well that depends on what you're investing in... i'm partial to the futures market where the only certainty is uncertainty... care to parse the quantum indeterminacy of modern finance over a dram or two?", "action": "NONE" }

(End):// End of conversation
:// Generated by: https://github.com/ConversationalAI/DialogueAPI
:// Date: Sat Jan 20 2024
// End of message
:// End of message
// End of message
// End of messages
// End of conversation
// End of conversation
// End of conversations
// End of conversations
// End of Conversations
// End of Conversations
// End of Conversations
// End of Conversations
// End of Conversations
// End of Conversations
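
For what it's worth, the tail above looks like the local model sampling straight past its stop marker instead of ending the turn. Below is a rough TypeScript illustration of the kind of post-processing that would at least hide the symptom; the helper and the marker list are hypothetical and only mirror what appears in this log, this is not Eliza's actual code.

// Hypothetical workaround sketch: cut a runaway completion at the first
// stop-style marker before handing it to the client. The marker list is
// taken from the log above, not from Eliza itself.
const STOP_MARKERS = ["(End)", "// End of conversation", "// End of message"];

function trimAtStopMarker(text: string): string {
  let cut = text.length;
  for (const marker of STOP_MARKERS) {
    const idx = text.indexOf(marker);
    if (idx !== -1 && idx < cut) {
      cut = idx; // keep only the text before the earliest marker
    }
  }
  return text.slice(0, cut).trim();
}

Applied to the response above, everything from "(End)" onward would be dropped before the reply reaches the client on 5173.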

hiteshjoshi1 added the bug (Something isn't working) label on Dec 30, 2024
Contributor

Hello @hiteshjoshi1! Welcome to the ai16z community. Thank you for opening your first issue; we appreciate your contribution. You are now an ai16z contributor!

@hiteshjoshi1 (Author) commented Dec 30, 2024

[Screenshot 2024-12-30 at 11:57:45 PM] This time it spat out all my previous queries and answered them in the console in a loop.

@BrandonFlorian commented

The other thread is here

#1213

@hiteshjoshi1 (Author) commented

Closing in favor of #1213
