deepseek-ai/deepseek-coder-5.7bmqa-base · Hugging Face #383
Labels
AI-Agents: Autonomous AI agents using LLMs
embeddings: Vector embeddings and related tools
github: GitHub tools such as the CLI, Actions, Issues, and Pages
llm: Large Language Models
Models: LLM and ML model repos and links
New-Label: Choose this option if the existing labels are insufficient to describe the content accurately
python: Python code, tools, and info
shell-script: Shell scripting in Bash, ZSH, POSIX, etc.
Deepseek Coder Introduction
Deepseek Coder is a series of code language models, each trained from scratch on 2T tokens with a composition of 87% code and 13% natural language in both English and Chinese. We provide various sizes of the code model, ranging from 1B to 33B versions. Each model is pre-trained on a project-level code corpus with a window size of 16K and an extra fill-in-the-blank task, supporting project-level code completion and infilling. Deepseek Coder achieves state-of-the-art performance among open-source code models on multiple programming languages and various benchmarks.
Key Features
Model Summary
How to Use
This section provides examples of how to use the Deepseek Coder model for code completion, code insertion, and repository-level code completion tasks.
Code Completion
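Since this is a base model, code completion is plain left-to-right continuation of a prompt. A minimal sketch using the standard Hugging Face transformers API (the quick-sort prompt, greedy decoding, and token budget are illustrative assumptions, not settings from the model card):

```python
MODEL_ID = "deepseek-ai/deepseek-coder-5.7bmqa-base"

def complete(prompt: str, max_new_tokens: int = 128) -> str:
    """Continue `prompt` with the base model (greedy decoding for simplicity)."""
    # transformers is imported lazily so the helper can be defined and
    # inspected even where the library or model weights are unavailable.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(complete("# write a quick sort algorithm in Python\n"))
```

For the larger checkpoints you would typically move the model to a GPU (e.g. `.cuda()` or `device_map="auto"`) before generating.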
Code Insertion
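Code insertion relies on the fill-in-the-blank (FIM) objective the models were pre-trained with: the prompt wraps the code before and after the gap in sentinel tokens, and the model generates the missing middle. The sentinel strings below follow DeepSeek Coder's published examples, but verify them against your checkpoint's tokenizer before relying on them:

```python
# Sentinel tokens as shown in DeepSeek Coder's published FIM examples;
# confirm against tokenizer.special_tokens_map for your checkpoint.
FIM_BEGIN = "<｜fim▁begin｜>"
FIM_HOLE = "<｜fim▁hole｜>"
FIM_END = "<｜fim▁end｜>"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt; the model generates the hole."""
    return f"{FIM_BEGIN}{prefix}{FIM_HOLE}{suffix}{FIM_END}"

def insert_code(prefix: str, suffix: str, max_new_tokens: int = 64) -> str:
    """Generate the code that belongs between `prefix` and `suffix`."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "deepseek-ai/deepseek-coder-5.7bmqa-base"
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
    inputs = tokenizer(build_fim_prompt(prefix, suffix), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, i.e. the filled-in middle.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `insert_code("def quick_sort(arr):\n    if len(arr) <= 1:\n        return arr\n", "\n    return quick_sort(left) + [pivot] + quick_sort(right)")` should produce the pivot/partition logic for the gap.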
Repository-Level Code Completion
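Because the models were pre-trained on a project-level corpus with a 16K window, repository-level completion amounts to packing the relevant files into a single prompt, with the file being completed placed last. A minimal sketch of that prompt assembly (the helper name and the path-comment convention are assumptions for illustration, not a format mandated by the model):

```python
def build_repo_prompt(files: dict[str, str], query_file: str) -> str:
    """Concatenate repository files into one prompt, each preceded by a
    path comment, ending with the file the model should continue.

    The combined prompt must fit in the model's 16K-token context window,
    so in practice you would select or truncate files first.
    """
    ordered = [(p, src) for p, src in files.items() if p != query_file]
    ordered.append((query_file, files[query_file]))
    return "\n".join(f"# {path}\n{source}" for path, source in ordered)

# Usage: pass the resulting string to the same generate() call used for
# plain code completion; the model continues the final (query) file.
files = {
    "utils.py": "def add(a, b):\n    return a + b\n",
    "main.py": "from utils import add\n\nprint(add(1, ",
}
prompt = build_repo_prompt(files, "main.py")
```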
License
This code repository is licensed under the MIT License. The use of Deepseek Coder models is subject to the Model License. DeepSeek Coder supports commercial use.
See the LICENSE-MODEL for more details.
Contact
If you have any questions, please raise an issue or contact us at [email protected].
Suggested labels
{ "key": "llm-experiments", "value": "Experiments and results related to Large Language Models" }
{ "key": "AI-Chatbots", "value": "Topics related to advanced chatbot platforms integrating multiple AI models" }