
Chain-of-Action: Faithful and Multimodal Question Answering through Large Language Models


This is the code for our ICLR 2025 paper Chain-of-Action. You can use this repo to reproduce the results in the paper.

  • You can try running our project by following the steps below; running it in different environments may surface various problems, and we are still working to make it robust and bug-free.
  • You will need your own OpenAI API key and Google Search API key; both are required by the baseline code and the paper code.
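Before running anything, it helps to confirm the required API keys are set. The sketch below uses hypothetical environment-variable names (`OPENAI_API_KEY`, `GOOGLE_SEARCH_API_KEY`); check the repo's scripts for the exact names they read.

```python
import os

# Hypothetical variable names -- the scripts in this repo may read different ones.
REQUIRED_KEYS = ["OPENAI_API_KEY", "GOOGLE_SEARCH_API_KEY"]

def missing_keys(env=os.environ):
    """Return the required API-key variables that are not set."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__":
    absent = missing_keys()
    if absent:
        raise SystemExit(f"Missing API keys: {', '.join(absent)}")
```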

Datasets

Download the datasets from the following sources:

https://github.com/google/BIG-bench/tree/main/bigbench/benchmark_tasks

https://fever.ai/dataset/fever.html

https://huggingface.co/datasets/Stanford/web_questions

Chain of Action

Environment Setup

pip install -r requirements.txt

Run Experiments

An example on a dataset in the setting without IR (information retrieval):

python chain-of-search-wo-ir.py

An example on a dataset in the setting with IR:

python chain-of-search.py
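At a high level, the paper's chain-of-action workflow decomposes a question into a chain of actions, executes each action (a web query when IR is enabled, the model's own knowledge otherwise), and composes the final answer. The sketch below is purely illustrative: the function names and structure are assumptions, not the actual API of `chain-of-search.py`.

```python
# Hypothetical sketch of the chain-of-action workflow; stubs stand in for LLM
# and search-API calls.

def decompose(question):
    """Stub: an LLM call would split the question into sub-questions/actions."""
    return [f"Sub-question 1 of: {question}", f"Sub-question 2 of: {question}"]

def execute_action(sub_question, use_ir=True):
    """Stub: with IR, issue a web search; without, rely on the model alone."""
    source = "web search" if use_ir else "model knowledge"
    return f"answer to '{sub_question}' via {source}"

def chain_of_action(question, use_ir=True):
    evidence = [execute_action(sq, use_ir) for sq in decompose(question)]
    # Stub: a final LLM call would synthesize the evidence into one answer.
    return " | ".join(evidence)
```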

Baseline

Environment Setup

You can set up the experimental environment by running the following commands:

$ cd baselines/src
$ pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
$ pip3 install -r requirements.txt
$ export PYTHONPATH=$PYTHONPATH:$PWD

Instructions

You can run any baseline with the code we provide.

An example on a dataset in the setting with ReAct:

python react.py
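The ReAct baseline interleaves reasoning and tool use in a Thought, Action, Observation loop. The following is a minimal, hypothetical sketch of that loop with stubbed LLM and tool calls; `react.py` in this repo will differ in detail.

```python
# Hypothetical ReAct-style loop (Thought -> Action -> Observation) with stubs.

def llm_step(history):
    """Stub LLM: search once, then finish once an observation is available."""
    if "Observation" in history:
        return ("finish", "final answer")
    return ("search", "example query")

def run_tool(action, arg):
    """Stub tool: a real implementation would call a search API."""
    return f"search results for '{arg}'"

def react_loop(question, max_steps=5):
    history = f"Question: {question}"
    for _ in range(max_steps):
        action, arg = llm_step(history)
        if action == "finish":
            return arg
        observation = run_tool(action, arg)
        history += f"\nAction: {action}[{arg}]\nObservation: {observation}"
    return None
```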

Acknowledgment

The experiments in this work benefit from the following open-source codebases:

https://github.com/xsc1234/Search-in-the-Chain

https://github.com/amazon-science/auto-cot

https://python.langchain.com/v0.1/docs/modules/agents/

https://github.com/stanfordnlp/dspy

https://github.com/lucidrains/toolformer-pytorch

https://github.com/princeton-nlp/tree-of-thought-llm

https://www.promptingguide.ai/

Citation

If you find our work useful, please consider citing our paper:

@inproceedings{pan2025chainofaction,
  title={Chain-of-Action: Faithful and Multimodal Question Answering through Large Language Models},
  author={Zhenyu Pan and Haozheng Luo and Manling Li and Han Liu},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025},
  url={https://openreview.net/forum?id=1BdPHbuimc}
}
