deepseek-coder-6.7b-instruct-8.0bpw-h8-exl2-2 · Hugging Face #189
Labels: llm (Large Language Models), MachineLearning (ML Models, Training and Inference), Models (LLM and ML model repos and links)
Deepseek Coder is a series of code language models, each trained from scratch on 2T tokens composed of 87% code and 13% natural language in English and Chinese. The code models come in a range of sizes, from 1B to 33B parameters. Each model is pre-trained on a project-level code corpus with a 16K window size and an extra fill-in-the-blank task, so it supports project-level code completion and infilling. Among open-source code models, Deepseek Coder achieves state-of-the-art coding performance across multiple programming languages and benchmarks.
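Since the issue only links the quantized model, a minimal usage sketch may help. This loads the upstream full-precision deepseek-ai/deepseek-coder-6.7b-instruct checkpoint with Hugging Face transformers; the 8.0bpw exl2 quant in the title would instead be loaded via ExLlamaV2. The prompt and generation settings below are illustrative, not the model card's exact example.

```python
# Minimal sketch: load the upstream full-precision checkpoint with transformers.
# (The exl2 quant from the issue title would instead be loaded via ExLlamaV2.)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-coder-6.7b-instruct"  # upstream repo, not the quant
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)

# The instruct variant ships a chat template, so build the prompt with it.
messages = [{"role": "user", "content": "Write a quicksort function in Python."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```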