Add Google Gemini provider support

- Add GeminiProvider with streaming and native tool calling
- Support gemini-2.5-pro, gemini-2.0-flash, gemini-1.5-pro/flash models
- Model-specific context window detection (1M-2M tokens)
- Message conversion: assistant -> model role mapping
- System messages extracted to system_instruction field
- Tool schema conversion with functionCall/functionResponse parts
- SSE streaming with JSON array buffer parsing
- 8 unit tests for conversion and parsing logic
- Register provider in g3-core and validate in g3-cli
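The role-mapping bullet above can be sketched roughly as follows. The types and function names here are illustrative, not the actual g3 API: Gemini's API has no "assistant" or "system" role, so assistant messages are mapped to the "model" role and system messages are collected into a separate system instruction string.

```rust
// Illustrative sketch of the message conversion described above (not the
// real g3 types): assistant -> "model", system messages pulled out into
// a separate system_instruction value.
#[derive(Clone, Copy)]
enum Role {
    System,
    User,
    Assistant,
}

struct Message {
    role: Role,
    content: String,
}

/// Returns (system_instruction, converted (role, text) pairs).
fn convert_messages(messages: &[Message]) -> (Option<String>, Vec<(String, String)>) {
    let mut system = Vec::new();
    let mut contents = Vec::new();
    for m in messages {
        match m.role {
            // System messages do not go into the contents array at all.
            Role::System => system.push(m.content.clone()),
            Role::User => contents.push(("user".to_string(), m.content.clone())),
            // Gemini uses "model" where OpenAI/Anthropic use "assistant".
            Role::Assistant => contents.push(("model".to_string(), m.content.clone())),
        }
    }
    let system = if system.is_empty() {
        None
    } else {
        Some(system.join("\n"))
    };
    (system, contents)
}

fn main() {
    let msgs = vec![
        Message { role: Role::System, content: "be terse".into() },
        Message { role: Role::User, content: "hi".into() },
        Message { role: Role::Assistant, content: "hello".into() },
    ];
    let (sys, contents) = convert_messages(&msgs);
    println!("system_instruction = {:?}", sys);
    println!("contents = {:?}", contents);
}
```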
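The context-window bullet could be implemented as a simple prefix match on the model name. The token limits below are assumptions based on Google's published Gemini model specs (gemini-1.5-pro advertises a 2M-token window, the other listed models 1M), not values read from the g3 source:

```rust
// Assumed per-model context windows; only gemini-1.5-pro gets the 2M
// limit, everything else falls back to 1M tokens.
fn context_window(model: &str) -> u64 {
    if model.starts_with("gemini-1.5-pro") {
        2_097_152
    } else {
        1_048_576
    }
}

fn main() {
    for m in ["gemini-2.5-pro", "gemini-2.0-flash", "gemini-1.5-pro"] {
        println!("{m}: {} tokens", context_window(m));
    }
}
```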
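The SSE-buffering bullet amounts to the usual incremental line framing: bytes arrive in arbitrary chunks, complete `data: {json}` lines are drained off the front of a buffer, and a trailing partial line is kept for the next read. A minimal sketch, assuming `data: `-prefixed line framing (not the actual g3 parser):

```rust
// Drain complete "data: ..." SSE lines from the front of `buffer`,
// leaving any trailing partial line in place for the next network read.
fn drain_sse(buffer: &mut String) -> Vec<String> {
    let mut events = Vec::new();
    while let Some(pos) = buffer.find('\n') {
        // Remove one full line (including the newline) from the buffer.
        let line: String = buffer.drain(..=pos).collect();
        if let Some(payload) = line.trim_end().strip_prefix("data: ") {
            events.push(payload.to_string());
        }
    }
    events
}

fn main() {
    let mut buf = String::new();
    // Two complete events plus a partial chunk, as a stream might deliver them.
    buf.push_str("data: {\"text\":\"Hel\"}\n");
    buf.push_str("data: {\"text\":\"lo\"}\ndata: {\"te");
    let events = drain_sse(&mut buf);
    println!("events = {:?}", events);
    println!("leftover = {:?}", buf); // partial line stays buffered
}
```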
Author: Dhanji R. Prasanna
Date: 2026-01-29 10:11:42 +11:00
Parent: fe33568ee0
Commit: 735e9c9312
6 changed files with 860 additions and 2 deletions


@@ -241,12 +241,14 @@ pub struct Tool {
 pub mod anthropic;
 pub mod databricks;
 pub mod embedded;
+pub mod gemini;
 pub mod oauth;
 pub mod openai;
 pub use anthropic::AnthropicProvider;
 pub use databricks::DatabricksProvider;
 pub use embedded::EmbeddedProvider;
+pub use gemini::GeminiProvider;
 pub use openai::OpenAIProvider;
 impl Message {