orca.ai/pkg/llm/llm.go
大森 6b94476347 Initial commit: Orca Agent Framework
Core features:
- Microkernel architecture with Actor model
- Session management with JSONL persistence
- Tool system (5 built-in tools)
- Skill system with SKILL.md parsing
- Sandboxed execution for security
- Ollama integration with gemma4:e4b
- Prompt-based tool calling (compatible with native function calling)
- REPL interface

11 packages, all tests passing
2026-05-08 00:55:48 +08:00


// Package llm provides the LLM integration layer for the Orca framework.
//
// It defines the LLM interface for interacting with language models,
// the Ollama client implementation, and the shared types for chat
// messages, tool calls, and streaming responses.
package llm

import "context"

// LLM is the interface for interacting with language models.
//
// Implementations provide Chat (for complete responses) and Stream
// (for streaming token-by-token responses) methods. Both methods
// accept a list of messages and return the model's response.
type LLM interface {
// Chat sends a list of messages to the LLM and returns a complete response.
// If the model decides to call tools, the response contains ToolCalls.
Chat(ctx context.Context, messages []Message) (*Response, error)

// Stream sends messages and streams the response token-by-token.
// The handler is called for each chunk. The final response is not
// collected; use Chat for complete responses.
Stream(ctx context.Context, messages []Message, handler StreamHandler) error
}