sync with upstream #12

Draft pull request: wants to merge 82 commits into base: dev

Commits (82):
b534738
fix: agent_stream_callback sometimes failed to detect keyword
devinyf Feb 18, 2024
6d008d1
trim potential spaces and colon in the beginning
devinyf Feb 18, 2024
5ba4c9e
Set PrintOutput to be true only once
devinyf Feb 18, 2024
491ba5f
add test case
devinyf Feb 18, 2024
65fc5f7
table driven test
devinyf Feb 19, 2024
d28c808
inline the test cases
devinyf Feb 20, 2024
c4ddf0b
chore: Pinning chroma-go ahead of major new release
tazarov Feb 29, 2024
0804075
Update huggingface.mdx
devalexandre Mar 2, 2024
17002cc
chains: fix add ignore StreamingFunc (#639)
Abirdcfly Mar 5, 2024
0dedf90
Merge pull request #640 from amikos-tech/chore/chroma-go-pin
tmc Mar 6, 2024
66bf2dd
Merge pull request #641 from devalexandre/patch-1
tmc Mar 6, 2024
75d2712
feat: add JSON format option to ollama
corani Mar 6, 2024
997607a
example: add new Ollama functions example
corani Mar 6, 2024
3320490
feat: run integration tests for vector databases using testcontainers…
mdelapenya Mar 6, 2024
75b9f41
Merge branch 'main' into more-its
mdelapenya Mar 6, 2024
1d71027
chore: skip qdrant tests if the OpenAI api key is not set
mdelapenya Mar 6, 2024
b74cceb
Merge pull request #614 from devinyf/fix_agent_stream_callback
tmc Mar 6, 2024
81643a8
Merge pull request #647 from corani/corani/ollamajson
tmc Mar 6, 2024
8cbd678
Merge pull request #648 from mdelapenya/more-its
tmc Mar 6, 2024
a51062f
embeddings: Add Amazon Bedrock embeddings (#643)
sansmoraxz Mar 6, 2024
f67b196
vectorstores: Add support for OpenAI Organization ID header in Chroma
AshDevFr Mar 4, 2024
7947f6d
chore: Refactor Chroma to change function names using `OpenAi` to us…
AshDevFr Mar 6, 2024
3b7175c
Merge pull request #646 from AshDevFr/chroma-openai-org-id
tmc Mar 7, 2024
25c6665
change prompt: defaultMrklPrefix to python version
devinyf Mar 7, 2024
ed99f8c
Merge pull request #653 from devinyf/fix_mrkl_prompt
tmc Mar 8, 2024
6cc64f8
examples: Point to v0.1.5
tmc Mar 8, 2024
01bf6a4
Merge pull request #656 from tmc/update-examples
tmc Mar 8, 2024
9986fd3
vectorstores: fix pgvector issues and add more test (#617)
Abirdcfly Mar 8, 2024
24cb833
googleai: return err not log.Fatal when stream get error (#663)
Abirdcfly Mar 11, 2024
255f6a9
docs: fixup a bunch of links (#659)
codyoss Mar 13, 2024
04e2df3
docs: Add example about retriever (#651)
devalexandre Mar 13, 2024
b82fbbf
Create FUNDING.yml (#669)
tmc Mar 13, 2024
b4e6529
docs: Add documentation for splitters (#649)
devalexandre Mar 13, 2024
ca2969c
tool: sql_database generate sql-query filter redundant text (#612)
devinyf Mar 13, 2024
6eaa82c
llms/anthropic: Implement WithBaseURL and WithHTTPClient for Anthropi…
GRVYDEV Mar 13, 2024
eecf58c
This example hangs if the JSON data is either marked as human or AI, …
timmattison Mar 13, 2024
768bc00
vectorstores/milvus: add WithMetricType (#674)
chenquan Mar 14, 2024
e9fbf3c
chore: Chroma dependency update (#676)
tazarov Mar 15, 2024
101dbf0
vectorstores/milvus: fixed the bug that the collection was created in…
chenquan Mar 15, 2024
ebb5d1a
Implementation proposal for using llamafile (#677)
devalexandre Mar 18, 2024
0218733
documentloaders: add AssemblyAI document loader (#668)
marcusolsson Mar 18, 2024
259311e
LLMs: Add AWS Bedrock LLMs (#666)
sansmoraxz Mar 18, 2024
e108163
docs: Fix docs build (#686)
tmc Mar 19, 2024
325d534
examples: Point to v0.1.6-alpha.1 (#687)
tmc Mar 19, 2024
6045596
llms: Improve json mode support (#683)
tmc Mar 19, 2024
014070c
examples: Update examples to v0.1.6 (#688)
tmc Mar 19, 2024
4065971
llms/openai: Azure: add check for `model` parameter (#691)
janezkenda Mar 19, 2024
8b54683
llms/openai: Optionally supply ResponseFormat (#690)
shawti Mar 19, 2024
dc3e6f6
examples: Update to v0.1.7 (#694)
tmc Mar 19, 2024
4746a5d
Improvements on agents package API (#551)
haochunchang Mar 19, 2024
5635461
llms: add caching functionality for Models (#564)
corani Mar 20, 2024
491288f
httputil: Add httputil package to provide some common helpers (#702)
tmc Mar 21, 2024
be54dc1
examples: Use new debugging helper in example (#703)
tmc Mar 21, 2024
3da52d3
examples: Fix up import in openai debugging example (#704)
tmc Mar 21, 2024
60fa95d
tests: Perform env-var checking skips sooner as to speed up tests (#706)
tmc Mar 21, 2024
4ad2e7d
llms/anthropic: adds full support for messages api (#707)
joeychilson Mar 21, 2024
5460983
llms/bedrock: Fixed error when using Claude3 model and giving Message…
mashiike Mar 22, 2024
83bf27c
readme: Include contributors (#714)
tmc Mar 23, 2024
7fb9a13
llms/cloudflare: Implement Cloudflare Workers AI LLM (#679)
rajaiswal Mar 24, 2024
d5f11f0
add new example for OCR using Claude3's Vision feature with Bedrock (…
mashiike Mar 24, 2024
8d90359
llms: Add mistral hosted inference llm implementation (#717)
aannirajpatel Mar 25, 2024
3932b31
vectorstores/weaviate: Update testcontainer image (#719)
tmc Mar 25, 2024
b6ba669
tooling: Update minimum go version to 1.22, update golangci-lint (#722)
tmc Mar 26, 2024
1261877
openai: Render single text content parts directly (#734)
tmc Mar 31, 2024
319b863
all: set explicit 1.22 version in go.mod (#727)
eliben Mar 31, 2024
d822839
googleai: increase default max tokens setting (#726)
eliben Mar 31, 2024
b15223e
examples: Fix and tidy examples, add nvidia example (#735)
tmc Mar 31, 2024
4174692
llms: Implement tool calling, including parallel tool call request su…
tmc Mar 31, 2024
05ab264
openai: WithEmbeddingModel option is incorrectly designating the des…
devalexandre Mar 31, 2024
0b63daa
llms: Add Seed option to all supporting backends (#732)
devalexandre Mar 31, 2024
a47ef50
examples: Update examples to v0.1.8 (#736)
tmc Mar 31, 2024
97e3644
googleai: vertex - upgrade dep version and increase default max token…
eliben Apr 1, 2024
14a2806
googleai: combine options for googleai and vertex (#743)
eliben Apr 1, 2024
ce2a479
googleai: add safety/harm threshold settings (#744)
eliben Apr 1, 2024
73710c5
GH actions: update lint workflow to newer version of Go (#745)
eliben Apr 1, 2024
930e0fb
vectorstores/milvus: Update testcontainer image (#741)
devalexandre Apr 2, 2024
7bbb2d8
tools/sqldatabase: update postgres image (#740)
devalexandre Apr 2, 2024
1f45c81
chains: Update mysql testcontainer image (#739)
devalexandre Apr 2, 2024
33b8795
vectorstores/qdrant: Update testcontainer image (#737)
devalexandre Apr 2, 2024
d161462
doc: fix typo (#758)
XiaoConstantine Apr 7, 2024
8b67ef3
examples: clarify openai-function-call-example (#751)
eliben Apr 7, 2024
ce2f2a3
Merge remote-tracking branch 'upstream/main' into dev-update
Abirdcfly Apr 10, 2024
2 changes: 2 additions & 0 deletions .github/FUNDING.yml
@@ -0,0 +1,2 @@
# These are supported funding model platforms
github: tmc
26 changes: 5 additions & 21 deletions .github/workflows/ci.yaml
@@ -12,18 +12,18 @@ jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/setup-go@v4
- uses: actions/checkout@v4
- uses: actions/setup-go@v5
with:
go-version: stable
go-version: '1.22'
# Cache is managed by golangci-lint
# https://github.com/actions/setup-go#caching-dependency-files-and-build-outputs
cache: false
- uses: actions/checkout@v3
- name: golangci-lint
uses: golangci/golangci-lint-action@v3.7.0
uses: golangci/golangci-lint-action@v4
with:
args: --timeout=4m
version: v1.55.1
version: v1.57.2
build-examples:
runs-on: ubuntu-latest
steps:
@@ -36,31 +36,15 @@ jobs:
run: make build-examples
build-test:
runs-on: ubuntu-latest
services:
postgres:
image: pgvector/pgvector:pg16
env:
POSTGRES_PASSWORD: postgres
POSTGRES_HOST_AUTH_METHOD: trust
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- uses: actions/setup-go@v4
with:
go-version: stable
- uses: actions/checkout@v3
- name: Build
run: go build -v ./...
- name: Create pgvector extension
run: PGPASSWORD=postgres psql -h localhost -U postgres -c 'CREATE EXTENSION vector'
- name: Test
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GENAI_API_KEY: ${{ secrets.GENAI_API_KEY }}
PGVECTOR_CONNECTION_STRING: postgresql://postgres:postgres@localhost:5432
run: go test -v ./...
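
Note on the removed Postgres service: per the testcontainers commits above (#648, #617), the vector-store integration tests now provision their own databases at test time instead of relying on a CI-level service container. The sketch below illustrates that pattern with testcontainers-go and the `pgvector/pgvector:pg16` image named in the removed workflow block; the helper name and exact wiring are illustrative assumptions, not the repository's actual test code.

```go
package pgvector_test

import (
	"context"
	"fmt"
	"testing"

	"github.com/testcontainers/testcontainers-go"
	"github.com/testcontainers/testcontainers-go/wait"
)

// startPgvector spins up a throwaway pgvector database for one test run.
// Hypothetical helper: the repository's tests may wire this differently.
func startPgvector(t *testing.T) string {
	t.Helper()
	ctx := context.Background()

	container, err := testcontainers.GenericContainer(ctx, testcontainers.GenericContainerRequest{
		ContainerRequest: testcontainers.ContainerRequest{
			Image:        "pgvector/pgvector:pg16",
			Env:          map[string]string{"POSTGRES_PASSWORD": "postgres"},
			ExposedPorts: []string{"5432/tcp"},
			WaitingFor:   wait.ForListeningPort("5432/tcp"),
		},
		Started: true,
	})
	if err != nil {
		t.Fatal(err)
	}
	t.Cleanup(func() { _ = container.Terminate(ctx) })

	host, err := container.Host(ctx)
	if err != nil {
		t.Fatal(err)
	}
	port, err := container.MappedPort(ctx, "5432")
	if err != nil {
		t.Fatal(err)
	}
	// Connection string consumed by the pgvector store under test.
	return fmt.Sprintf("postgresql://postgres:postgres@%s:%s", host, port.Port())
}
```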
6 changes: 5 additions & 1 deletion .golangci.yaml
@@ -28,8 +28,12 @@ linters:
- nolintlint # see https://github.com/golangci/golangci-lint/issues/3228.
- depguard # disabling temporarily
- ireturn # disabling temporarily
- perfsprint
- musttag

linters-settings:
cyclop:
max-complexity: 12
funlen:
lines: 90
depguard:
@@ -46,5 +50,5 @@ linters-settings:
- "**/*_test.go"
- "**/mock/**/*.go"
run:
skip-dirs:
exclude-dirs:
- 'exp'
7 changes: 6 additions & 1 deletion Makefile
@@ -26,9 +26,14 @@ lint-all:
lint-deps:
@command -v golangci-lint >/dev/null 2>&1 || { \
echo >&2 "golangci-lint not found. Installing..."; \
go install github.com/golangci/golangci-lint/cmd/golangci-lint@v1.55.1; \
go install github.com/golangci/golangci-lint/cmd/golangci-lint@v1.57.1; \
}

.PHONY: docs
docs:
@echo "Generating documentation..."
$(MAKE) -C docs build

.PHONY: test-race
test-race:
go test -race ./...
7 changes: 7 additions & 0 deletions README.md
@@ -62,3 +62,10 @@ Here are some links to blog posts and articles on using Langchain Go:
- [Using Ollama with LangChainGo](https://eli.thegreenplace.net/2023/using-ollama-with-langchaingo/) - Nov 2023
- [Creating a simple ChatGPT clone with Go](https://sausheong.com/creating-a-simple-chatgpt-clone-with-go-c40b4bec9267?sk=53a2bcf4ce3b0cfae1a4c26897c0deb0) - Aug 2023
- [Creating a ChatGPT Clone that Runs on Your Laptop with Go](https://sausheong.com/creating-a-chatgpt-clone-that-runs-on-your-laptop-with-go-bf9d41f1cf88?sk=05dc67b60fdac6effb1aca84dd2d654e) - Aug 2023


# Contributors

<a href="https://github.com/tmc/langchaingo/graphs/contributors">
<img src="https://contrib.rocks/image?repo=tmc/langchaingo" />
</a>
2 changes: 1 addition & 1 deletion agents/conversational.go
@@ -40,7 +40,7 @@ type ConversationalAgent struct {

var _ Agent = (*ConversationalAgent)(nil)

func NewConversationalAgent(llm llms.Model, tools []tools.Tool, opts ...CreationOption) *ConversationalAgent {
func NewConversationalAgent(llm llms.Model, tools []tools.Tool, opts ...Option) *ConversationalAgent {
options := conversationalDefaultOptions()
for _, opt := range opts {
opt(&options)
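
This is the first of several hunks renaming the constructor option type from `CreationOption` to `Option`. A minimal caller-side sketch under that rename; `WithMaxIterations` is assumed to be one of the package's option helpers, and the LLM wiring is illustrative only.

```go
package example

import (
	"github.com/tmc/langchaingo/agents"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/tools"
)

// newAgent collects constructor options under the renamed agents.Option type
// (previously agents.CreationOption).
func newAgent(llm *openai.LLM, toolset []tools.Tool) *agents.ConversationalAgent {
	opts := []agents.Option{
		agents.WithMaxIterations(3), // assumed option helper
	}
	return agents.NewConversationalAgent(llm, toolset, opts...)
}
```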
24 changes: 12 additions & 12 deletions agents/executor.go
@@ -26,18 +26,18 @@ type Executor struct {
}

var (
_ chains.Chain = Executor{}
_ callbacks.HandlerHaver = Executor{}
_ chains.Chain = &Executor{}
_ callbacks.HandlerHaver = &Executor{}
)

// NewExecutor creates a new agent executor with an agent and the tools the agent can use.
func NewExecutor(agent Agent, tools []tools.Tool, opts ...CreationOption) Executor {
func NewExecutor(agent Agent, tools []tools.Tool, opts ...Option) *Executor {
options := executorDefaultOptions()
for _, opt := range opts {
opt(&options)
}

return Executor{
return &Executor{
Agent: agent,
Tools: tools,
Memory: options.memory,
@@ -48,7 +48,7 @@ func NewExecutor(agent Agent, tools []tools.Tool, opts ...CreationOption) Execut
}
}

func (e Executor) Call(ctx context.Context, inputValues map[string]any, _ ...chains.ChainCallOption) (map[string]any, error) { //nolint:lll
func (e *Executor) Call(ctx context.Context, inputValues map[string]any, _ ...chains.ChainCallOption) (map[string]any, error) { //nolint:lll
inputs, err := inputsToString(inputValues)
if err != nil {
return nil, err
@@ -81,7 +81,7 @@ func (e Executor) Call(ctx context.Context, inputValues map[string]any, _ ...cha
), ErrNotFinished
}

func (e Executor) doIteration( // nolint
func (e *Executor) doIteration( // nolint
ctx context.Context,
steps []schema.AgentStep,
nameToTool map[string]tools.Tool,
@@ -123,7 +123,7 @@ func (e Executor) doIteration( // nolint
return steps, nil, nil
}

func (e Executor) doAction(
func (e *Executor) doAction(
ctx context.Context,
steps []schema.AgentStep,
nameToTool map[string]tools.Tool,
@@ -152,7 +152,7 @@ func (e Executor) doAction(
}), nil
}

func (e Executor) getReturn(finish *schema.AgentFinish, steps []schema.AgentStep) map[string]any {
func (e *Executor) getReturn(finish *schema.AgentFinish, steps []schema.AgentStep) map[string]any {
if e.ReturnIntermediateSteps {
finish.ReturnValues[_intermediateStepsOutputKey] = steps
}
@@ -162,20 +162,20 @@ func (e Executor) getReturn(finish *schema.AgentFinish, steps []schema.AgentStep

// GetInputKeys gets the input keys the agent of the executor expects.
// Often "input".
func (e Executor) GetInputKeys() []string {
func (e *Executor) GetInputKeys() []string {
return e.Agent.GetInputKeys()
}

// GetOutputKeys gets the output keys the agent of the executor returns.
func (e Executor) GetOutputKeys() []string {
func (e *Executor) GetOutputKeys() []string {
return e.Agent.GetOutputKeys()
}

func (e Executor) GetMemory() schema.Memory { //nolint:ireturn
func (e *Executor) GetMemory() schema.Memory { //nolint:ireturn
return e.Memory
}

func (e Executor) GetCallbackHandler() callbacks.Handler { //nolint:ireturn
func (e *Executor) GetCallbackHandler() callbacks.Handler { //nolint:ireturn
return e.CallbacksHandler
}

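
The executor is now constructed and used through a pointer: `NewExecutor` returns `*Executor`, and `chains.Chain` plus `callbacks.HandlerHaver` are satisfied by `*Executor` rather than the value type. A rough usage sketch under the new signature, assuming `chains.Run` and the `tools.Calculator` tool behave as in other langchaingo examples:

```go
package example

import (
	"context"
	"fmt"

	"github.com/tmc/langchaingo/agents"
	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/tools"
)

// runAgent wires an agent into the chains helpers via the new *Executor type.
func runAgent(ctx context.Context) error {
	llm, err := openai.New() // requires OPENAI_API_KEY in the environment
	if err != nil {
		return err
	}

	toolset := []tools.Tool{tools.Calculator{}}
	agent := agents.NewOneShotAgent(llm, toolset)
	executor := agents.NewExecutor(agent, toolset) // now returns *Executor

	// *Executor satisfies chains.Chain, so it plugs straight into chains.Run.
	answer, err := chains.Run(ctx, executor, "What is 3 to the power of 5?")
	if err != nil {
		return err
	}
	fmt.Println(answer)
	return nil
}
```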
7 changes: 4 additions & 3 deletions agents/initialize.go
@@ -19,23 +19,24 @@ const (
ConversationalReactDescription AgentType = "conversationalReactDescription"
)

// Deprecated: This may be removed in the future; please use NewExecutor instead.
// Initialize is a function that creates a new executor with the specified LLM
// model, tools, agent type, and options. It returns an Executor or an error
if there are any issues during the creation process.
func Initialize(
llm llms.Model,
tools []tools.Tool,
agentType AgentType,
opts ...CreationOption,
) (Executor, error) {
opts ...Option,
) (*Executor, error) {
var agent Agent
switch agentType {
case ZeroShotReactDescription:
agent = NewOneShotAgent(llm, tools, opts...)
case ConversationalReactDescription:
agent = NewConversationalAgent(llm, tools, opts...)
default:
return Executor{}, ErrUnknownAgentType
return &Executor{}, ErrUnknownAgentType
}
return NewExecutor(agent, tools, opts...), nil
}
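
Since `Initialize` is now deprecated in favour of `NewExecutor`, the migration looks roughly like the before/after sketch below, with signatures taken from the hunks above:

```go
package example

import (
	"github.com/tmc/langchaingo/agents"
	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/tools"
)

// Before: the deprecated helper picked the agent type from an enum.
func oldStyle(llm llms.Model, toolset []tools.Tool) (*agents.Executor, error) {
	return agents.Initialize(llm, toolset, agents.ZeroShotReactDescription)
}

// After: build the agent explicitly, then wrap it in an executor.
func newStyle(llm llms.Model, toolset []tools.Tool) *agents.Executor {
	agent := agents.NewOneShotAgent(llm, toolset)
	return agents.NewExecutor(agent, toolset)
}
```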
2 changes: 1 addition & 1 deletion agents/mrkl.go
@@ -41,7 +41,7 @@ var _ Agent = (*OneShotZeroAgent)(nil)
// NewOneShotAgent creates a new OneShotZeroAgent with the given LLM model, tools,
// and options. It returns a pointer to the created agent. The opts parameter
// represents the options for the agent.
func NewOneShotAgent(llm llms.Model, tools []tools.Tool, opts ...CreationOption) *OneShotZeroAgent {
func NewOneShotAgent(llm llms.Model, tools []tools.Tool, opts ...Option) *OneShotZeroAgent {
options := mrklDefaultOptions()
for _, opt := range opts {
opt(&options)
4 changes: 2 additions & 2 deletions agents/mrkl_prompt.go
@@ -9,8 +9,8 @@ import (
)

const (
_defaultMrklPrefix = `Today is {{.today}} and you can use tools to get new information.
Answer the following questions as best you can using the following tools:
_defaultMrklPrefix = `Today is {{.today}}.
Answer the following questions as best you can. You have access to the following tools:

{{.tool_descriptions}}`

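
The MRKL prefix now matches the Python LangChain wording (see commit #653 above) and keeps the Go-template placeholders `{{.today}}` and `{{.tool_descriptions}}`. The standalone snippet below only illustrates how those placeholders resolve; the agent itself renders the prompt through the project's prompts package, so treat this as a sketch:

```go
package main

import (
	"os"
	"text/template"
	"time"
)

// The agent fills these placeholders via the project's prompts package;
// text/template is used here purely to show how the variables slot in.
const mrklPrefix = `Today is {{.today}}.
Answer the following questions as best you can. You have access to the following tools:

{{.tool_descriptions}}`

func main() {
	tmpl := template.Must(template.New("prefix").Parse(mrklPrefix))
	_ = tmpl.Execute(os.Stdout, map[string]string{
		"today":             time.Now().Format("January 02, 2006"),
		"tool_descriptions": "calculator: Useful for getting the result of a math expression.",
	})
}
```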
4 changes: 2 additions & 2 deletions agents/openai_functions_agent.go
@@ -33,7 +33,7 @@ type OpenAIFunctionsAgent struct {
var _ Agent = (*OpenAIFunctionsAgent)(nil)

// NewOpenAIFunctionsAgent creates a new OpenAIFunctionsAgent.
func NewOpenAIFunctionsAgent(llm llms.Model, tools []tools.Tool, opts ...CreationOption) *OpenAIFunctionsAgent {
func NewOpenAIFunctionsAgent(llm llms.Model, tools []tools.Tool, opts ...Option) *OpenAIFunctionsAgent {
options := openAIFunctionsDefaultOptions()
for _, opt := range opts {
opt(&options)
@@ -132,7 +132,7 @@ func (o *OpenAIFunctionsAgent) GetOutputKeys() []string {
return []string{o.OutputKey}
}

func createOpenAIFunctionPrompt(opts CreationOptions) prompts.ChatPromptTemplate {
func createOpenAIFunctionPrompt(opts Options) prompts.ChatPromptTemplate {
messageFormatters := []prompts.MessageFormatter{prompts.NewSystemMessagePromptTemplate(opts.systemMessage, nil)}
messageFormatters = append(messageFormatters, opts.extraMessages...)
messageFormatters = append(messageFormatters, prompts.NewHumanMessagePromptTemplate("{{.input}}", []string{"input"}))
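
For context, `createOpenAIFunctionPrompt` (now taking the renamed `Options` struct) assembles a chat prompt from a system message, any extra message formatters, and the user input. A sketch that mirrors only the visible part of that assembly, not the package's internal code:

```go
package example

import "github.com/tmc/langchaingo/prompts"

// buildPrompt mirrors the visible shape of createOpenAIFunctionPrompt:
// a system message, any extra formatters, then the user's input.
func buildPrompt(systemMessage string, extra []prompts.MessageFormatter) prompts.ChatPromptTemplate {
	formatters := []prompts.MessageFormatter{
		prompts.NewSystemMessagePromptTemplate(systemMessage, nil),
	}
	formatters = append(formatters, extra...)
	formatters = append(formatters,
		prompts.NewHumanMessagePromptTemplate("{{.input}}", []string{"input"}))
	return prompts.NewChatPromptTemplate(formatters)
}
```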