path: root/Omni/Agent/Provider.hs
6 days ago  fix: accumulate streaming tool call arguments across SSE chunks  (Ben Sima)
OpenAI's SSE streaming sends tool calls incrementally: the first chunk carries the id and function name, and subsequent chunks contain argument fragments. Previously each chunk was treated as a complete tool call, causing invalid JSON arguments.

- Add ToolCallDelta type with index for partial tool call data
- Add StreamToolCallDelta chunk type
- Track tool calls by index in an IntMap accumulator
- Merge argument fragments across chunks via mergeToolCallDelta
- Build final ToolCall objects from the accumulator when the stream ends
- Handle the new StreamToolCallDelta in the Engine.hs pattern match
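A minimal sketch of the accumulation scheme this commit describes, using assumed type and field names (ToolCallDelta, ToolCall, tcdIndex, and so on) rather than the actual Provider.hs definitions:

{-# LANGUAGE OverloadedStrings #-}
-- Accumulate partial tool calls from SSE chunks, keyed by index.
-- All names here are illustrative assumptions, not the real module.
import Control.Applicative ((<|>))
import Data.IntMap.Strict (IntMap)
import qualified Data.IntMap.Strict as IntMap
import Data.Maybe (fromMaybe)
import Data.Text (Text)

data ToolCallDelta = ToolCallDelta
  { tcdIndex     :: Int        -- position of the tool call in the response
  , tcdId        :: Maybe Text -- present only in the first chunk
  , tcdName      :: Maybe Text -- present only in the first chunk
  , tcdArguments :: Maybe Text -- fragment of the JSON arguments string
  }

data ToolCall = ToolCall
  { tcId        :: Text
  , tcName      :: Text
  , tcArguments :: Text
  }

-- Fold one delta into the accumulator: keep the first id/name seen,
-- append argument fragments in arrival order.
mergeToolCallDelta :: ToolCallDelta -> IntMap ToolCallDelta -> IntMap ToolCallDelta
mergeToolCallDelta delta = IntMap.insertWith merge (tcdIndex delta) delta
  where
    merge new old = old
      { tcdId        = tcdId old <|> tcdId new
      , tcdName      = tcdName old <|> tcdName new
      , tcdArguments = tcdArguments old <> tcdArguments new
      }

-- When the stream ends, collapse the accumulated deltas into ToolCalls.
finalizeToolCalls :: IntMap ToolCallDelta -> [ToolCall]
finalizeToolCalls acc =
  [ ToolCall (fromMaybe "" (tcdId d)) (fromMaybe "" (tcdName d)) (fromMaybe "" (tcdArguments d))
  | d <- IntMap.elems acc
  ]

Keying the accumulator on each chunk's index keeps fragments attached to the right tool call even when several calls stream at once; the Maybe fields are only collapsed into complete ToolCall values once the stream ends.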
6 days ago  telegram: fix parsing, add webpage reader, use gemini  (Ben Sima)
- Fix Provider.hs to strip leading whitespace from OpenRouter responses
- Fix the FunctionCall parser to handle a missing 'arguments' field
- Use eitherDecode for better error messages on parse failures
- Switch to claude-sonnet-4.5 for the main agent
- Use gemini-2.0-flash for conversation summarization (cheaper)
- Add a read_webpage tool for fetching and summarizing URLs
- Add tagsoup to the Haskell deps (unused, kept for future use)
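A sketch of what the more lenient FunctionCall parsing might look like, with assumed field names and an assumed "{}" default for the missing 'arguments' case; the real Provider.hs types may differ:

{-# LANGUAGE OverloadedStrings #-}
-- Lenient FunctionCall decoding: tolerate a missing 'arguments' field and
-- leading whitespace, and report failures via eitherDecode.
-- Names are illustrative assumptions.
import Data.Aeson (FromJSON (..), eitherDecode, withObject, (.:), (.:?), (.!=))
import qualified Data.ByteString.Lazy.Char8 as BL
import Data.Text (Text)

data FunctionCall = FunctionCall
  { fcName      :: Text
  , fcArguments :: Text -- raw JSON string of the arguments
  } deriving (Show)

instance FromJSON FunctionCall where
  parseJSON = withObject "FunctionCall" $ \o ->
    FunctionCall
      <$> o .: "name"
      -- some responses omit 'arguments' entirely; default to an empty object
      <*> o .:? "arguments" .!= "{}"

-- Strip any padding before the JSON body, then decode with error reporting.
parseFunctionCall :: BL.ByteString -> Either String FunctionCall
parseFunctionCall = eitherDecode . BL.dropWhile (`elem` (" \t\r\n" :: [Char]))

Calling parseFunctionCall on the raw response body yields either an aeson error string or a FunctionCall, which makes parse failures easier to diagnose than a bare Nothing from decode.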
7 days ago  t-247: Add Provider abstraction for multi-backend LLM support  (Ben Sima)
- Create Omni/Agent/Provider.hs with a unified Provider interface
- Support OpenRouter (cloud), Ollama (local), and Amp (subprocess stub)
- Add runAgentWithProvider to Engine.hs for Provider-based execution
- Add EngineType to Core.hs (EngineOpenRouter, EngineOllama, EngineAmp)
- Add an --engine flag to the 'jr work' command
- Worker.hs dispatches to the appropriate provider based on engine type

Usage:
  jr work <task-id>                  # OpenRouter (default)
  jr work <task-id> --engine=ollama  # Local Ollama
  jr work <task-id> --engine=amp     # Amp CLI (stub)
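A rough sketch of the shape such a Provider abstraction could take; the record fields, placeholder message types, and stub providers below are assumptions for illustration, not the actual Provider.hs API:

{-# LANGUAGE OverloadedStrings #-}
-- Illustrative shape for a multi-backend provider; all names are assumed.
import Data.Text (Text)

-- Mirrors the EngineType constructors added to Core.hs.
data EngineType = EngineOpenRouter | EngineOllama | EngineAmp
  deriving (Show, Eq)

-- Placeholder request/response types for the sketch.
data Message = Message { msgRole :: Text, msgContent :: Text }
newtype Completion = Completion Text

-- The unified interface: each backend supplies one completion action.
data Provider = Provider
  { providerName     :: Text
  , providerComplete :: [Message] -> IO Completion
  }

-- Worker-side dispatch from the --engine flag to a concrete provider.
providerFor :: EngineType -> Provider
providerFor EngineOpenRouter = openRouterProvider
providerFor EngineOllama     = ollamaProvider
providerFor EngineAmp        = ampProvider

openRouterProvider, ollamaProvider, ampProvider :: Provider
openRouterProvider = Provider "openrouter" (\_ -> pure (Completion "TODO: cloud HTTP call"))
ollamaProvider     = Provider "ollama"     (\_ -> pure (Completion "TODO: local HTTP call"))
ampProvider        = Provider "amp"        (\_ -> pure (Completion "TODO: subprocess"))

The point of a record-of-functions shape like this is that Engine.hs can call providerComplete without knowing which backend it is talking to; the --engine flag only decides which Provider value the worker constructs.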