Agent Smith

An API to create local-first, human-friendly agents in the browser or Node.js


πŸ“š Read the documentation

Check the πŸ’» examples

What is an agent?

An agent is a language model that can make decisions. It can:

  • Think: use language model servers to perform inference queries (a minimal sketch follows this list)
  • Work: manage long-running workflows with multiple tasks, using tools
  • Remember: use semantic memory to store data
  • Interact: handle interactions with the user
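
For example, the Think capability boils down to a single inference query. The sketch below reuses the Agent and Lm calls from the full Node.js example later in this README; the server URL, sampling values and prompts are placeholders, and the model name is left empty as in that example:

import { Agent } from "@agent-smith/agent";
import { Lm } from "@locallm/api";

// Minimal "Think" sketch: one inference query, no tools
const lm = new Lm({
    providerType: "openai",                  // same provider type as the example below
    serverUrl: "http://localhost:8080/v1",   // placeholder: a local Llama.cpp server
    apiKey: "",
    onToken: (t) => process.stdout.write(t), // stream the answer to stdout
});
const agent = new Agent(lm);
await agent.run(
    "List the planets of the solar system",
    // inference params (model name left empty, as in the example below)
    { model: "", temperature: 0.2, max_tokens: 1024 },
    // query options
    { system: "You are a helpful assistant" }
);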

Packages

Name                 Description            Node.js   Browser
@agent-smith/cli     Terminal client        ✅         ❌
@agent-smith/agent   Agent                  ✅         ✅
@agent-smith/task    Task                   ✅         ✅
@agent-smith/smem    Semantic memory        ✅         ❌
@agent-smith/tfm     Templates for models   ✅         ✅
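
The Node.js packages can be installed from npm under the names listed above; for example, to add the agent package to a project:

npm install @agent-smith/agent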

Philosophy

  • Composable: the packages have limited responsibilities and can work together
  • Declarative: focus on the business logic by expressing features simply
  • Explicit: keep it simple and under user control, with no hidden magic

Requirements

Supported inference servers:

Example

Terminal client

Simple inference query (using the inference plugin):

lm q list the planets of the solar system

Query a thinking model, using qwq (from the models plugin):

lm think "solve this math problem: ..." m=qwq

Compare images (using the vision plugin):

lm vision img1.jpg img2.jpg "Compare the images"

Generate a commit message in a git repository (using the git plugin):

lm commit

Terminal client plugins

Plugins for the terminal client are available: terminal client plugins

Node.js example: using an agent with local tools

import { Agent } from "@agent-smith/agent";
import { Lm } from "@locallm/api";

let model; // optional model name: if left undefined, an empty string is passed to the run call below
const serverUrl = "http://localhost:8080/v1"; // Local Llama.cpp (OpenAI-compatible endpoint)
const apiKey = "";
const system = "You are a helpful tourist assistant";
const _prompt = `I am landing in Barcelona in one hour: I plan to reach my hotel and then go for outdoor sport. 
How are the conditions in the city?`;

function run_get_current_weather(args) {
    console.log("Running the get_current_weather tool with args", args);
    // Stub: return a hard-coded weather payload as a JSON string
    return '{ "temp": 20.5, "weather": "rain" }';
}

function run_get_current_traffic(args) {
    console.log("Running the get_current_traffic tool with args", args);
    // Stub: return a hard-coded traffic payload as a JSON string
    return '{ "traffic": "normal" }';
}

const get_current_weather = {
    "name": "get_current_weather",
    "description": "Get the current weather",
    "arguments": {
        "location": {
            "description": "The city and state, e.g. San Francisco, CA",
            "required": true
        }
    },
    execute: run_get_current_weather
};

const get_current_traffic = {
    "name": "get_current_traffic",
    "description": "Get the current road traffic conditions",
    "arguments": {
        "location": {
            "description": "The city and state, e.g. San Francisco, CA",
            "required": true
        }
    },
    execute: run_get_current_traffic
};

async function main() {
    const lm = new Lm({
        providerType: "openai",
        serverUrl: serverUrl,
        apiKey: apiKey,
        onToken: (t) => process.stdout.write(t),
    });
    const agent = new Agent(lm);
    await agent.run(_prompt,
        // inference params
        {
            model: model ?? "",
            temperature: 0.5,
            top_k: 40,
            top_p: 0.95,
            min_p: 0,
            max_tokens: 4096,
        },
        // query options
        {
            debug: false,
            verbose: true,
            system: system,
            tools: [get_current_weather, get_current_traffic]
        });
}

(async () => {
    await main();
})();
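
Note that the two tools return hard-coded JSON payloads, so the example runs without any external service; in a real integration the execute functions would query an actual weather or traffic API.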

Server API example

To execute a task using the server HTTP API:

import { useServer } from "@agent-smith/apicli";

const api = useServer({
    apiKey: "server_api_key",
    onToken: (t) => {
        // handle the streamed tokens here
        process.stdout.write(t)
    }
});
await api.executeTask(
    "translate", 
    "Which is the largest planet of the solar system?", 
    { lang: "german" }
)

Libraries

The CLI is powered by:

  • Locallm for inference API server management
  • Modprompt for prompt template management

The server is powered by:
