Context Management for Session in Case of Context Length Breach #116

Open
ashish-spext opened this issue Dec 31, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

@ashish-spext
Contributor

ashish-spext commented Dec 31, 2024

Confirm this is a new feature request

  • I've checked the current issues, and there's no record of this feature request

Describe the feature

Problem:
As a session accumulates messages, the conversation history can exceed the LLM's context length. This scenario is currently unhandled.

Proposed Implementations:

  1. Simple Sliding Window Approach:
    • Continuously drop the oldest messages to make room for new ones as the context approaches its limit (see the first sketch below).
  2. Smart Context Management with Vector Matching:
    • Implement a vector-based retrieval system to inject only the most relevant older messages (beyond a certain threshold) into the context (see the second sketch below).
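
A minimal sketch of both approaches, assuming the session stores messages as a list of `{"role": ..., "content": ...}` dicts. The `embed()` helper and the `approx_tokens()` heuristic are hypothetical placeholders (a real tokenizer and embedding model would be substituted), not part of any existing API:

```python
from typing import Callable, Dict, List

import numpy as np

Message = Dict[str, str]  # e.g. {"role": "user", "content": "..."}


def approx_tokens(text: str) -> int:
    """Rough token estimate (~4 chars per token); swap in a real tokenizer if available."""
    return max(1, len(text) // 4)


def sliding_window(messages: List[Message], max_tokens: int) -> List[Message]:
    """Approach 1: drop the oldest messages until the history fits the token budget."""
    kept: List[Message] = []
    total = 0
    # Walk from newest to oldest, keeping messages while the budget allows.
    for msg in reversed(messages):
        cost = approx_tokens(msg["content"])
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))


def relevant_context(
    messages: List[Message],
    query: str,
    embed: Callable[[str], np.ndarray],  # hypothetical embedding helper
    recent_n: int = 10,
    top_k: int = 5,
) -> List[Message]:
    """Approach 2: keep the last `recent_n` messages verbatim and inject only the
    `top_k` older messages most similar to the current query."""
    recent = messages[-recent_n:]
    older = messages[:-recent_n]
    if not older:
        return recent

    q = embed(query)
    scores = []
    for i, msg in enumerate(older):
        v = embed(msg["content"])
        sim = float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v) + 1e-9))
        scores.append((sim, i))

    # Re-insert the selected older messages in their original chronological order.
    selected = sorted(i for _, i in sorted(scores, reverse=True)[:top_k])
    return [older[i] for i in selected] + recent
```

In practice the sliding-window trim is cheap enough to run on every request, while the vector-matching variant would want message embeddings cached or stored in a vector index rather than recomputed each time.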

Additional Context

No response

@ashish-spext ashish-spext added the enhancement New feature or request label Dec 31, 2024