adding log_prob option for chat models #219
Conversation
Review by Korbit AI
Korbit automatically attempts to detect when you fix issues in new commits.
Issue | Fix Detected
---|---
Unvalidated message attributes | ✅
Invalid AIMessage Dictionary Assignment | ✅
Files scanned
File Path | Reviewed
---|---
src/agentlab/llm/base_api.py | ✅
src/agentlab/llm/huggingface_utils.py | ✅
src/agentlab/llm/chat_api.py | ✅
src/agentlab/llm/llm_utils.py | ✅
```python
res = AIMessage(completion.choices[0].message.content)
if self.log_probs:
    res["log_probs"] = completion.choices[0].log_probs
```
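The pattern above can be sketched end to end. Note the `Choice` stand-in below is a hypothetical mock of the provider's completion object, not the real response type, and `build_response` is a hypothetical helper standing in for the chat model's response handling:

```python
from dataclasses import dataclass


class AIMessage(dict):
    """Minimal stand-in for agentlab's dict-based assistant message."""

    def __init__(self, content):
        super().__init__()
        self["role"] = "assistant"
        self["content"] = content


@dataclass
class Choice:
    """Hypothetical mock of one completion choice from the API."""

    message_content: str
    log_probs: list


def build_response(choice: Choice, log_probs_enabled: bool) -> AIMessage:
    res = AIMessage(choice.message_content)
    if log_probs_enabled:
        # Attach token log-probabilities only when the option is turned on,
        # so existing configs see no change in the message structure.
        res["log_probs"] = choice.log_probs
    return res


choice = Choice("hello", [-0.1, -0.5])
msg = build_response(choice, log_probs_enabled=True)
print(msg["log_probs"])  # [-0.1, -0.5]
```

Because the flag defaults to off, messages produced without it carry no `log_probs` key, which keeps downstream code that serializes messages unchanged.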
src/agentlab/llm/llm_utils.py (outdated)
```python
def __init__(self, role: str, content: Union[str, list[dict]], **kwargs):
    self["role"] = role
    self["content"] = deepcopy(content)
    self.update(kwargs)
```
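This constructor can be sketched in a self-contained form; the class name `Message` is a placeholder for whatever dict subclass the PR attaches this `__init__` to. The `**kwargs` pass-through is what lets extra attributes such as `log_probs` ride along without a schema change, and `deepcopy` guards against callers mutating shared content:

```python
from copy import deepcopy
from typing import Union


class Message(dict):
    """Hypothetical dict-based chat message matching the __init__ above."""

    def __init__(self, role: str, content: Union[str, list[dict]], **kwargs):
        super().__init__()
        self["role"] = role
        # Deep-copy so later mutation of the caller's object cannot
        # silently alter an already-built message.
        self["content"] = deepcopy(content)
        # Arbitrary extra keys, e.g. log_probs, land directly in the dict.
        self.update(kwargs)


m = Message("assistant", [{"type": "text", "text": "hi"}], log_probs=[-0.2])
print(m["log_probs"])  # [-0.2]
```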
```diff
@@ -21,6 +21,7 @@ class BaseModelArgs(ABC):
     max_new_tokens: int = None
     temperature: float = 0.1
     vision_support: bool = False
+    log_probs: bool = False
```
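The diff above adds a single defaulted field to a dataclass, which is backward compatible: existing configs that never mention `log_probs` keep working. A minimal runnable sketch, showing only the fields visible in the diff (the real class has more):

```python
from abc import ABC
from dataclasses import dataclass


@dataclass
class BaseModelArgs(ABC):
    # Only the fields shown in the diff; other fields are omitted here.
    max_new_tokens: int = None
    temperature: float = 0.1
    vision_support: bool = False
    log_probs: bool = False  # new flag added by this PR, off by default


# Opting in is a one-line change in a config:
args = BaseModelArgs(log_probs=True)
print(args.log_probs)  # True
```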
The `log_probs` argument is now part of all chat_model_args and has to be set to True in your LLM config. @optimass
LGTM
With this option enabled (in the LLM configs), the chat_messages will also include log-probabilities, which are saved along with the outputs.
Description by Korbit AI

What change is being made?

Add support for log probabilities (`log_probs`) in chat models by introducing a `log_probs` option across various components of the chat model architecture.

Why are these changes being made?

To allow users to obtain the probabilities associated with model predictions, providing insight into model confidence and improving model interpretability. This change enhances the flexibility and functionality of the chat model by giving users the option to access additional predictive information if desired.
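As a sketch of why log-probabilities help gauge confidence: summing per-token log-probs gives the log-probability of the whole completion, and exponentiating that yields a joint probability usable as a rough confidence score. The values below are hypothetical:

```python
import math

# Hypothetical per-token log-probabilities for a three-token completion.
token_log_probs = [-0.105, -0.223, -0.051]

# The log of a product is the sum of logs, so summing gives the
# log-probability of the full token sequence.
sequence_log_prob = sum(token_log_probs)

# Exponentiate to recover the joint probability of the sequence.
sequence_prob = math.exp(sequence_log_prob)
print(round(sequence_prob, 3))  # 0.685
```

Working in log space avoids numerical underflow for long completions, which is why APIs return log-probabilities rather than raw probabilities.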