AIChat: All-in-one AI CLI Tool


AIChat is an all-in-one AI CLI tool featuring Chat-REPL, Shell Assistant, RAG, AI Tools & Agents, and More.

Install

Package Managers

  • Rust Developers: cargo install aichat
  • Homebrew/Linuxbrew Users: brew install aichat
  • Pacman Users: yay -S aichat
  • Windows Scoop Users: scoop install aichat
  • Android Termux Users: pkg install aichat

Pre-built Binaries

Download pre-built binaries for macOS, Linux, and Windows from GitHub Releases, extract them, and add the aichat binary to your $PATH.
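
For example, on Linux the steps might look like this (the archive name below is illustrative; pick the asset that matches your OS and architecture from the Releases page):

# Extract the downloaded archive (the exact name varies by release version and platform)
tar -xzf aichat-*-x86_64-unknown-linux-musl.tar.gz
# Put the extracted binary somewhere on your $PATH
sudo mv aichat /usr/local/bin/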

Get Started

On its first launch after installation, AIChat guides you through initializing the configuration file.

aichat-init-config

You can tailor AIChat to your preferences by editing the configuration file.

The config.example.yaml file provides a comprehensive list of all configuration options with detailed explanations.
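
Once the configuration file is in place, a quick sanity check might look like this (--info prints the resolved configuration; the greeting is just an arbitrary first prompt):

# Show the active model, configuration values, and related file paths
aichat --info

# Send a first prompt in CMD mode
aichat "hello"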

Features

Multi-Platform Support

Effortlessly connect with over 20 leading LLM platforms through a unified interface:

  • OpenAI: GPT-4/GPT-3.5 (paid, chat, embedding, vision, function-calling)
  • Gemini: Gemini-1.5/Gemini-1.0 (free, paid, chat, embedding, vision, function-calling)
  • Claude: Claude-3.5/Claude-3 (paid, chat, vision, function-calling)
  • Ollama: (free, local, chat, embedding, vision, function-calling)
  • Groq: Llama-3.1/Mixtral/Gemma2 (free, chat, function-calling)
  • Azure-OpenAI: GPT-4/GPT-3.5 (paid, chat, embedding, vision, function-calling)
  • VertexAI: Gemini/Claude/Mistral (paid, chat, embedding, vision, function-calling)
  • Bedrock: Llama3.1/Claude3.5/Mistral/Command-R+ (paid, chat, embedding, function-calling)
  • Mistral: (paid, chat, embedding, function-calling)
  • Cohere: Command-R/Command-R+ (paid, chat, embedding, reranker, function-calling)
  • Perplexity: Llama-3/Mixtral (paid, chat, online)
  • Cloudflare: (free, chat, embedding)
  • OpenRouter: (paid, chat, function-calling)
  • Replicate: (paid, chat)
  • Ernie: (paid, chat, embedding, reranker, function-calling)
  • Qianwen: Qwen (paid, chat, embedding, vision, function-calling)
  • Moonshot: (paid, chat, function-calling)
  • Deepseek: (paid, chat, function-calling)
  • ZhipuAI: GLM-4 (paid, chat, embedding, vision, function-calling)
  • LingYiWanWu: Yi-Large (paid, chat, vision, function-calling)
  • Jina: (free, paid, embedding, reranker)
  • VoyageAI: (paid, embedding, reranker)
  • OpenAI-Compatible Platforms
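
Models are addressed as <client>:<model-name>, the same format used in the proxy example further below; here is a sketch of switching platforms from the command line (the prompt text is arbitrary):

# Use the default model from the configuration file
aichat "What is the capital of France?"

# Pick a specific platform and model with -m/--model
aichat -m claude:claude-3-opus-20240229 "What is the capital of France?"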

CMD & REPL Modes

AIChat supports both CMD and REPL modes to meet the needs and tastes of different users.

CMD                         REPL
-m, --model <model>         .model <model>
-r, --role <role>           .role <role>
--prompt <prompt>           .prompt <text>
-s, --session [<session>]   .session [<session>]
-a, --agent <agent>         .agent <agent>
-R, --rag <rag>             .rag <rag>
-f, --file <file/url>       .file <file/url>
--info                      .info

aichat-cmd-mode aichat-repl-mode
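
For instance, the same role and session options can be given as CLI flags in CMD mode or as dot commands inside the REPL (the role and session names below are hypothetical; > denotes the REPL prompt):

# CMD mode: one-shot request with a role and a session
aichat -r shell-expert -s work "How do I list open ports?"

# REPL mode: start the REPL, then use the equivalent dot commands
aichat
> .role shell-expert
> .session work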

Shell Assistant

Supercharge your command line experience. Simply describe your desired actions in natural language, and let AIChat translate your requests into precise shell commands.

aichat-execute

OS-Aware Intelligence: AIChat tailors commands to your specific operating system and shell environment.
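
A minimal sketch of this workflow, assuming the -e/--execute flag (not shown in the table above) enters shell-assistant mode; AIChat proposes a command and asks for confirmation before anything runs:

# Describe the task in natural language; AIChat suggests the matching shell command
aichat -e "find all PNG files larger than 10MB in the current directory"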

Custom Roles

Define custom roles to tailor LLM behaviors, enhancing interactions and boosting productivity.

aichat-role
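
As a sketch, a role can be applied per request with -r/--role or switched inside the REPL with .role (the "translator" role below is hypothetical and would need to be defined first):

# CMD mode: apply a custom role to a single request
aichat -r translator "Bonjour tout le monde"

# REPL mode: switch roles on the fly
> .role translator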

Session Management

Maintain context-aware conversations through sessions, ensuring continuity in interactions.

aichat-session
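
A sketch of session usage with the -s/--session flag and the .session REPL command (the session name is arbitrary):

# Requests in the same named session share context
aichat -s project-x "Suggest a name for a CLI tool"
aichat -s project-x "Give me three more options"

# Inside the REPL
> .session project-x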

RAG

Integrate external documents into your AI conversations for more accurate and contextually relevant responses.

aichat-rag
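
A sketch of working with a RAG, assuming that -R/--rag (see the CMD/REPL table above) creates or opens a knowledge base by name and asks for document sources when it is first built; the name is illustrative:

# Create or open a RAG named "docs"
aichat --rag docs

# Inside the REPL, switch to a RAG with the dot command
> .rag docs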

Function Calling

Function calling supercharges LLMs by connecting them to external tools and data sources. This unlocks a world of possibilities, enabling LLMs to go beyond their core capabilities and tackle a wider range of tasks.

We have created a new repository https://github.com/sigoden/llm-functions to help you make the most of this feature.

Tools

Integrate external tools to automate tasks, retrieve information, and perform actions directly within your workflow.

aichat-tool

Agents

While tools excel at specific tasks, agents offer a more sophisticated approach to problem-solving and user interaction.

Agent = Prompt (Role) + Tools (Function Calling) + Knowledge (RAG). This is the same concept as OpenAI's GPTs.

aichat-agent
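
A sketch of invoking an agent by name with -a/--agent; the agent itself would be set up via the llm-functions repository, and "coder" is a hypothetical name:

# CMD mode: route a single request through an agent
aichat -a coder "Add error handling to src/main.rs"

# REPL mode
> .agent coder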

Local Server

AIChat comes with a built-in, lightweight HTTP server.

$ aichat --serve
Chat Completions API: http://127.0.0.1:8000/v1/chat/completions
Embeddings API:       http://127.0.0.1:8000/v1/embeddings
LLM Playground:       http://127.0.0.1:8000/playground
LLM Arena:            http://127.0.0.1:8000/arena?num=2

Proxy LLM APIs

AIChat can function as a proxy server for all supported LLMs, allowing you to interact with any of them through the familiar OpenAI API format and simplifying access across providers.

Test with curl:

curl -X POST -H "Content-Type: application/json" -d '{
  "model":"claude:claude-3-opus-20240229",
  "messages":[{"role":"user","content":"hello"}], 
  "stream":true
}' http://127.0.0.1:8000/v1/chat/completions
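
The embeddings endpoint accepts an OpenAI-style request body as well; a sketch (the model name is illustrative):

curl -X POST -H "Content-Type: application/json" -d '{
  "model":"openai:text-embedding-3-small",
  "input":"hello"
}' http://127.0.0.1:8000/v1/embeddings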

LLM Playground

The LLM Playground is a webapp that allows you to interact with any LLM supported by AIChat directly in your browser.

aichat-llm-playground

LLM Arena

The LLM Arena is a web-based platform where you can compare different LLMs side-by-side.

aichat-llm-arena

Custom Themes

AIChat supports custom dark and light themes, which highlight response text and code blocks.

aichat-themes

Documentation

License

Copyright (c) 2023-2024 aichat-developers.

AIChat is made available under the terms of either the MIT License or the Apache License 2.0, at your option.

See the LICENSE-APACHE and LICENSE-MIT files for license details.