README recommends Beepo-22b as the largest and most powerful LLM. Is there an mmproj that can be used with it? Or are there any plans to release one?
yi-34b is the biggest LLM supporting vision through mmproj. However, Qwen 32b is the highest-scoring 32b LLM (and the only one capable of even surpassing a few 72b models). So is there an mmproj that could be used with it? Or do you plan on releasing one?
Thanks!
Unfortunately while there is Pixtral, nobody has bothered to add llama.cpp support for it yet. So for now your best options will be Qwen2VL or MiniCPM.
I’m sorry, but I’m not sure which part of your answer applies to each of my questions.
Could you clarify, please?
Thanks!
EDIT - making my way through it...
Beepo-22b and mmproj: as of now, there's no mmproj available for, or compatible with, Beepo-22b.
Qwen-32b and mmproj: while Pixtral is a multi-modal model, it currently lacks llama.cpp support. This makes using Qwen-32b with an mmproj infeasible at the moment.
So, the best available paths for now are:
MiniCPM (SOTA vision model paired with a small LLM)
Qwen2VL (good vision model paired with a significantly larger LLM)
yi-34b (the only ~30b multi-modal solution for now; not sure how good the vision is for this one, but perhaps the best compromise, overall, for now)
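For reference, all of these options are loaded the same way: pass the LLM together with its matching vision projector. A minimal sketch using KoboldCpp's `--model` and `--mmproj` flags (the GGUF filenames below are illustrative placeholders, not actual release names):

```shell
# Launch KoboldCpp with a vision-capable model and its projector.
# Filenames are placeholders -- substitute the actual GGUF files
# for MiniCPM, Qwen2VL, or yi-34b.
python koboldcpp.py \
  --model MiniCPM-V.gguf \
  --mmproj MiniCPM-V-mmproj.gguf \
  --contextsize 4096
```

The key point is that the mmproj file must be the one trained for that specific base model, which is why there is no mix-and-match path for Beepo-22b or Qwen-32b today.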