Anyone understand how to use "ghost attention" (GAtt) with llama 2? #2541

JohanAR started this conversation in General
Replies: 1 comment 2 replies

@JohanAR
@darkman111a
3 participants