beago brings the Unix philosophy to LLM applications: small, focused handlers connected by pipes. Each handler reads from an io.Reader, transforms the stream, and writes to an io.Writer — exactly like Unix programs connected with |. The core library has no external dependencies.
Unix pipes let you compose small programs into powerful workflows:
```sh
echo "text" | translate | summarise | fmt
```
beago works the same way, but for LLM pipelines:
```go
pipe.Execute(ctx, os.Stdin, os.Stdout,
	llm.Prompt("Translate to French."),
	llm.Generate(model), // stdin | translate
	llm.Prompt("Summarise in one sentence."),
	llm.Generate(model), // | summarise
)
```

Each `pipe.Handler` is a composable unit. Handlers are chained with `pipe.Execute`, looped with `pipe.Loop`, and debugged with `pipe.Tee` — mirroring Unix's `tee(1)`.
- Pipe — the core primitive: a `Handler` that reads `io.Reader` → transforms → writes `io.Writer`
- Execute — chains handlers sequentially, connecting each output to the next input via `io.Pipe`
- Loop — runs a handler chain repeatedly, feeding each iteration's output as the next input; stops on `ErrDone` or a max iteration count
- Tee — splits the stream like Unix `tee(1)`: passes data through while copying to a second writer for debugging
- Agents — ReAct (Reasoning + Acting) loops that interleave LLM reasoning with tool execution
- Tools — implement the `Tool` interface to give agents new capabilities
```go
// Single handler: pipe stdin through an LLM to stdout
// echo "What is 2+2?" | go run .
pipe.Execute(ctx, os.Stdin, os.Stdout,
	llm.Generate(model),
)
```

```go
// Chain handlers: translate then summarise
pipe.Execute(ctx, os.Stdin, os.Stdout,
	llm.Prompt("Translate to French."),
	llm.Generate(model),
	llm.Prompt("Summarise in one sentence."),
	llm.Generate(model),
)
```

```go
// Loop until the LLM outputs "DONE"
pipe.Execute(ctx, os.Stdin, os.Stdout,
	pipe.Loop(10,
		llm.Generate(model),
		pipe.Exit(func(b []byte) bool {
			return bytes.Contains(b, []byte("DONE"))
		}),
	),
)
```

| Example | Description |
|---|---|
| pipe | Single LLM call — the simplest pipe |
| pipe/tee | Split the stream with Tee to inspect output |
| pipe/chain | Chain two LLM calls: translate → summarise |
| pipe/loop | Loop until a stop condition is met |
| agents | ReAct agent with tools |
Contributions of any kind are welcome! See Get Involved to get started.