diff --git a/documentation/blog/2025-08-27-get-started-for-free-with-tetrate/index.md b/documentation/blog/2025-08-27-get-started-for-free-with-tetrate/index.md index 3b069dfb8da0..28243c458d0e 100644 --- a/documentation/blog/2025-08-27-get-started-for-free-with-tetrate/index.md +++ b/documentation/blog/2025-08-27-get-started-for-free-with-tetrate/index.md @@ -1,6 +1,6 @@ --- -title: "Your First Goose Experience Is On Us" -description: New Goose users receive $10 in Tetrate Agent Router credits for instant access to multiple models including GPT-5 and Sonnet-4. +title: "Your First goose Experience Is On Us" +description: New goose users receive $10 in Tetrate Agent Router credits for instant access to multiple models including GPT-5 and Sonnet-4. authors: - mic - rizel @@ -8,18 +8,18 @@ authors: ![](tetrate-header.png) - You shouldn’t need a credit card to vibe code with Goose. While Goose is completely free to use, the reality is that most performant LLMs aren't. You want to experience Goose in action without breaking the bank or jumping through hoops. We've been thinking about how to make that first step easier for newcomers to Goose. + You shouldn’t need a credit card to vibe code with goose. While goose is completely free to use, the reality is that most performant LLMs aren't. You want to experience goose in action without breaking the bank or jumping through hoops. We've been thinking about how to make that first step easier for newcomers to goose. -That's why we're thrilled about our newest provider integration: [Tetrate's Agent Router Service](https://router.tetrate.ai). From August 27th through October 2nd, new Goose users can get $10 in credits to use Goose with any model on the Tetrate platform. +That's why we're thrilled about our newest provider integration: [Tetrate's Agent Router Service](https://router.tetrate.ai). New goose users can get $10 in credits to use goose with any model on the Tetrate platform. -We've upgraded the onboarding flow. 
Tetrate Agent Router now appears as a [recommended setup option](/docs/getting-started/installation#set-llm-provider) for new users. Selecting Tetrate takes you through OAuth account creation, then drops you back into Goose with your $10 credits ready to go. +We've upgraded the onboarding flow. Tetrate Agent Router now appears as a [recommended setup option](/docs/getting-started/installation#set-llm-provider) for new users. Selecting Tetrate takes you through OAuth account creation, then drops you back into goose with your $10 credits ready to go. ![fresh install](welcome.png) -This integration gives Goose users: +This integration gives goose users: * **Instant access** to models without manual setup * **$10 in credits** to start building without a paywall * **A unified model layer** powered by Tetrate @@ -34,17 +34,17 @@ Tetrate's Agent Router Service provides unified access to a comprehensive collec Tetrate brings years of expertise in routing and infrastructure to the AI space. As major contributors to open source projects like Istio and Envoy, they understand how to build reliable, scalable routing systems. Now they're applying this same expertise to LLM traffic management. -LLM requests are inherently stateless, making them ideal for intelligent routing across multiple providers and models. This allows you to optimize for cost, speed, availability, or quality, or even use multiple models to cross-check results. Terminology in this space is still settling. Goose refers to Tetrate as a “provider” for consistency, though under the hood it is a router service that connects to other providers. That layer abstracts away model selection, auth, and host config, keeping your setup clean. +LLM requests are inherently stateless, making them ideal for intelligent routing across multiple providers and models. This allows you to optimize for cost, speed, availability, or quality, or even use multiple models to cross-check results. 
Terminology in this space is still settling. goose refers to Tetrate as a “provider” for consistency, though under the hood it is a router service that connects to other providers. That layer abstracts away model selection, auth, and host config, keeping your setup clean. ## Why This Collaboration Matters -Our goal is simple: make Goose accessible to everyone, immediately. That means removing barriers to getting started. Tetrate's generous credit offering and seamless integration help us achieve exactly that. +Our goal is simple: make goose accessible to everyone, immediately. That means removing barriers to getting started. Tetrate's generous credit offering and seamless integration help us achieve exactly that. It also reflects Tetrate's ongoing commitment to open source and making AI development more accessible to developers worldwide. ## Explore the Full Model Catalog -While Goose auto-configures with Sonnet-4 by default, you have access to Tetrate's entire model catalog through the interface: +While goose auto-configures with Sonnet-4 by default, you have access to Tetrate's entire model catalog through the interface: ![providers](providers.png) ![gpt5](gpt5.png) @@ -55,26 +55,26 @@ Browse and select from a wide range of options, including: - **Specialized models** optimized for different use cases :::tip Protip - Want the best of both worlds? Use Goose’s [Lead/Worker configuration](/docs/tutorials/lead-worker) to combine a powerful frontier model with a faster open-weight model. Let your Lead handle the high-level thinking while Workers take care of the repetitive tasks—saving you both time and credits. + Want the best of both worlds? Use goose’s [Lead/Worker configuration](/docs/tutorials/lead-worker) to combine a powerful frontier model with a faster open-weight model. Let your Lead handle the high-level thinking while Workers take care of the repetitive tasks—saving you both time and credits. 
::: --- Thank you to Tetrate for supporting open source and making AI development more accessible! -**What are you waiting for?** [Get started with Goose](/) +**What are you waiting for?** [Get started with goose](/) *Got questions?* Explore our [docs](/docs/category/guides), browse the [blog](/blog), or join the conversation in our [Discord](https://discord.gg/block-opensource) and [GitHub Discussions](https://github.com/block/goose/discussions). We’d love to have you. - + - + - - + + \ No newline at end of file diff --git a/documentation/docs/getting-started/providers.md b/documentation/docs/getting-started/providers.md index eea499437dc8..02902460ceb8 100644 --- a/documentation/docs/getting-started/providers.md +++ b/documentation/docs/getting-started/providers.md @@ -10,7 +10,7 @@ import { ModelSelectionTip } from '@site/src/components/ModelSelectionTip'; # Supported LLM Providers -Goose is compatible with a wide range of LLM providers, allowing you to choose and integrate your preferred model. +goose is compatible with a wide range of LLM providers, allowing you to choose and integrate your preferred model. :::tip Model Selection @@ -33,8 +33,8 @@ Goose is compatible with a wide range of LLM providers, allowing you to choose a | [Groq](https://groq.com/) | High-performance inference hardware and tools for LLMs. | `GROQ_API_KEY` | | [LiteLLM](https://docs.litellm.ai/docs/) | LiteLLM proxy supporting multiple models with automatic prompt caching and unified API access. | `LITELLM_HOST`, `LITELLM_BASE_PATH` (optional), `LITELLM_API_KEY` (optional), `LITELLM_CUSTOM_HEADERS` (optional), `LITELLM_TIMEOUT` (optional) | | [Ollama](https://ollama.com/) | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. 
**Because this provider runs locally, you must first [download and run a model](#local-llms).** | `OLLAMA_HOST` |
-| [Ramalama](https://ramalama.ai/) | Local model using native [OCI](https://opencontainers.org/) container runtimes, [CNCF](https://www.cncf.io/) tools, and supporting models as OCI artifacts. Ramalama API an compatible alternative to Ollama and can be used with the Goose Ollama provider. Supports Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](#local-llms).** | `OLLAMA_HOST` |
-| [OpenAI](https://platform.openai.com/api-keys) | Provides gpt-4o, o1, and other advanced language models. Also supports OpenAI-compatible endpoints (e.g., self-hosted LLaMA, vLLM, KServe). **o1-mini and o1-preview are not supported because Goose uses tool calling.** | `OPENAI_API_KEY`, `OPENAI_HOST` (optional), `OPENAI_ORGANIZATION` (optional), `OPENAI_PROJECT` (optional), `OPENAI_CUSTOM_HEADERS` (optional) |
+| [Ramalama](https://ramalama.ai/) | Local model runner using native [OCI](https://opencontainers.org/) container runtimes, [CNCF](https://www.cncf.io/) tools, and supporting models as OCI artifacts. The Ramalama API is compatible with Ollama's, so it can be used with the goose Ollama provider. Supports Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](#local-llms).** | `OLLAMA_HOST` |
+| [OpenAI](https://platform.openai.com/api-keys) | Provides gpt-4o, o1, and other advanced language models. Also supports OpenAI-compatible endpoints (e.g., self-hosted LLaMA, vLLM, KServe). 
**o1-mini and o1-preview are not supported because goose uses tool calling.** | `OPENAI_API_KEY`, `OPENAI_HOST` (optional), `OPENAI_ORGANIZATION` (optional), `OPENAI_PROJECT` (optional), `OPENAI_CUSTOM_HEADERS` (optional) | | [OpenRouter](https://openrouter.ai/) | API gateway for unified access to various models with features like rate-limiting management. | `OPENROUTER_API_KEY` | | [Snowflake](https://docs.snowflake.com/user-guide/snowflake-cortex/aisql#choosing-a-model) | Access the latest models using Snowflake Cortex services, including Claude models. **Requires a Snowflake account and programmatic access token (PAT)**. | `SNOWFLAKE_HOST`, `SNOWFLAKE_TOKEN` | | [Tetrate Agent Router Service](https://router.tetrate.ai) | Unified API gateway for AI models including Claude, Gemini, GPT, open-weight models, and others. Supports PKCE authentication flow for secure API key generation. | `TETRATE_API_KEY`, `TETRATE_HOST` (optional) | @@ -43,7 +43,7 @@ Goose is compatible with a wide range of LLM providers, allowing you to choose a ## CLI Providers -Goose also supports special "pass-through" providers that work with existing CLI tools, allowing you to use your subscriptions instead of paying per token: +goose also supports special "pass-through" providers that work with existing CLI tools, allowing you to use your subscriptions instead of paying per token: | Provider | Description | Requirements | |-----------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| @@ -58,10 +58,40 @@ CLI providers are cost-effective alternatives that use your existing subscriptio 
## Configure Provider -To configure your chosen provider or see available options, visit the `Models` tab in the Goose Desktop or run `goose configure` in the CLI. +To configure your chosen provider or see available options, visit the `Models` tab in goose Desktop or run `goose configure` in the CLI. - + + **First-time users:** + + On the welcome screen the first time you open goose, you have three options: + - **Automatic setup with [Tetrate Agent Router](https://tetrate.io/products/tetrate-agent-router-service)** + - **Automatic Setup with [OpenRouter](https://openrouter.ai/)** + - **Other Providers** + + + We recommend starting with Tetrate Agent Router. Tetrate provides access to multiple AI models with built-in rate limiting and automatic failover. + + :::info Free Credits Offer + You'll receive $10 in free credits the first time you automatically authenticate with Tetrate through goose. This offer is available to both new and existing Tetrate users. + ::: + 1. Choose `Automatic setup with Tetrate Agent Router`. + 2. goose will open a browser window for you to authenticate with Tetrate, or create a new account if you don't have one already. + 3. When you return to the goose desktop app, you're ready to begin your first session. + + + + 1. Choose `Automatic setup with OpenRouter`. + 2. goose will open a browser window for you to authenticate with OpenRouter, or create a new account if you don't have one already. + 3. When you return to the goose desktop app, you're ready to begin your first session. + + + + 1. If you have a specific provider you want to use with goose, and an API key from that provider, choose `Other Providers`. + 2. Find the provider of your choice and click its `Configure` button. If you don't see your provider in the list, click `Add Custom Provider` at the bottom of the window. + 3. Depending on your provider, you'll need to input your API Key, API Host, or other optional [parameters](#available-providers). 
Click the `Submit` button to authenticate and begin your first session. + + **To update your LLM provider and API key:** 1. Click the button in the top-left to open the sidebar 2. Click the `Settings` button on the sidebar @@ -89,8 +119,8 @@ To configure your chosen provider or see available options, visit the `Models` t 3. Click the `Models` tab 4. Click `Reset Provider and Model` to clear your current settings and return to the welcome screen - - 1. Run the following command: + + 1. In your terminal, run the following command: ```sh goose configure @@ -106,7 +136,7 @@ To configure your chosen provider or see available options, visit the `Models` t │ ○ Add Extension │ ○ Toggle Extensions │ ○ Remove Extension - │ ○ Goose Settings + │ ○ goose Settings └ ``` 3. Choose a model provider and press Enter. @@ -157,13 +187,13 @@ To configure your chosen provider or see available options, visit the `Models` t │ ◓ Checking your configuration... └ Configuration saved successfully -``` + ``` ## Using Custom OpenAI Endpoints -Goose supports using custom OpenAI-compatible endpoints, which is particularly useful for: +goose supports using custom OpenAI-compatible endpoints, which is particularly useful for: - Self-hosted LLMs (e.g., LLaMA, Mistral) using vLLM or KServe - Private OpenAI-compatible API servers - Enterprise deployments requiring data governance and security compliance @@ -220,7 +250,7 @@ Goose supports using custom OpenAI-compatible endpoints, which is particularly u ### Setup Instructions - + 1. Click the button in the top-left to open the sidebar 2. Click the `Settings` button on the sidebar 3. Click the `Models` tab @@ -233,7 +263,7 @@ Goose supports using custom OpenAI-compatible endpoints, which is particularly u - Project (for resource management) 7. Click `Submit` - + 1. Run `goose configure` 2. Select `Configure Providers` 3. 
Choose `OpenAI` as the provider @@ -249,19 +279,19 @@ Goose supports using custom OpenAI-compatible endpoints, which is particularly u For enterprise deployments, you can pre-configure these values using environment variables or configuration files to ensure consistent governance across your organization. ::: -## Using Goose for Free +## Using goose for Free -Goose is a free and open source AI agent that you can start using right away, but not all supported [LLM Providers][providers] provide a free tier. +goose is a free and open source AI agent that you can start using right away, but not all supported [LLM Providers][providers] provide a free tier. Below, we outline a couple of free options and how to get started with them. :::warning Limitations -These free options are a great way to get started with Goose and explore its capabilities. However, you may need to upgrade your LLM for better performance. +These free options are a great way to get started with goose and explore its capabilities. However, you may need to upgrade your LLM for better performance. ::: ### Groq -Groq provides free access to open source models with high-speed inference. To use Groq with Goose, you need an API key from [Groq Console](https://console.groq.com/keys). +Groq provides free access to open source models with high-speed inference. To use Groq with goose, you need an API key from [Groq Console](https://console.groq.com/keys). 
Groq offers several open source models that support tool calling: - **moonshotai/kimi-k2-instruct** - Mixture-of-Experts model with 1 trillion parameters, optimized for agentic intelligence and tool use @@ -269,10 +299,10 @@ Groq offers several open source models that support tool calling: - **gemma2-9b-it** - Google's Gemma 2 model with instruction tuning - **llama-3.3-70b-versatile** - Meta's Llama 3.3 model for versatile applications -To set up Groq with Goose, follow these steps: +To set up Groq with goose, follow these steps: - + **To update your LLM provider and API key:** 1. Click the button in the top-left to open the sidebar. @@ -283,7 +313,7 @@ To set up Groq with Goose, follow these steps: 6. Click `Configure`, enter your API key, and click `Submit`. - + 1. Run: ```sh goose configure @@ -296,12 +326,12 @@ To set up Groq with Goose, follow these steps: ### Google Gemini -Google Gemini provides a free tier. To start using the Gemini API with Goose, you need an API Key from [Google AI studio](https://aistudio.google.com/app/apikey). +Google Gemini provides a free tier. To start using the Gemini API with goose, you need an API Key from [Google AI studio](https://aistudio.google.com/app/apikey). -To set up Google Gemini with Goose, follow these steps: +To set up Google Gemini with goose, follow these steps: - + **To update your LLM provider and API key:** 1. Click the button in the top-left to open the sidebar. @@ -312,7 +342,7 @@ To set up Google Gemini with Goose, follow these steps: 6. Click `Configure`, enter your API key, and click `Submit`. - + 1. Run: ```sh goose configure @@ -347,10 +377,10 @@ To set up Google Gemini with Goose, follow these steps: ### Local LLMs -Goose is a local AI agent, and by using a local LLM, you keep your data private, maintain full control over your environment, and can work entirely offline without relying on cloud access. 
However, please note that local LLMs require a bit more set up before you can use one of them with Goose. +goose is a local AI agent, and by using a local LLM, you keep your data private, maintain full control over your environment, and can work entirely offline without relying on cloud access. However, please note that local LLMs require a bit more set up before you can use one of them with goose. :::warning Limited Support for models without tool calling -Goose extensively uses tool calling, so models without it can only do chat completion. If using models without tool calling, all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions). +goose extensively uses tool calling, so models without it can only do chat completion. If using models without tool calling, all goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions). ::: Here are some local providers we support: @@ -362,7 +392,7 @@ Here are some local providers we support: 1. [Download Ramalama](https://github.com/containers/ramalama?tab=readme-ov-file#install). 2. In a terminal, run any Ollama [model supporting tool-calling](https://ollama.com/search?c=tools) or [GGUF format HuggingFace Model](https://huggingface.co/search/full-text?q=%22tools+support%22+%2B+%22gguf%22&type=model): - The `--runtime-args="--jinja"` flag is required for Ramalama to work with the Goose Ollama provider. + The `--runtime-args="--jinja"` flag is required for Ramalama to work with the goose Ollama provider. Example: @@ -370,7 +400,7 @@ Here are some local providers we support: ramalama serve --runtime-args="--jinja" ollama://qwen2.5 ``` - 3. In a separate terminal window, configure with Goose: + 3. In a separate terminal window, configure with goose: ```sh goose configure @@ -388,7 +418,7 @@ Here are some local providers we support: └ ``` - 5. 
Choose `Ollama` as the model provider since Ramalama is API compatible and can use the Goose Ollama provider + 5. Choose `Ollama` as the model provider since Ramalama is API compatible and can use the goose Ollama provider ``` ┌ goose-configure @@ -451,12 +481,12 @@ Here are some local providers we support: ``` :::tip Context Length - If you notice that Goose is having trouble using extensions or is ignoring [.goosehints](/docs/guides/using-goosehints), it is likely that the model's default context length of 2048 tokens is too low. Use `ramalama serve` to set the `--ctx-size, -c` option to a [higher value](https://github.com/containers/ramalama/blob/main/docs/ramalama-serve.1.md#--ctx-size--c). + If you notice that goose is having trouble using extensions or is ignoring [.goosehints](/docs/guides/using-goosehints), it is likely that the model's default context length of 2048 tokens is too low. Use `ramalama serve` to set the `--ctx-size, -c` option to a [higher value](https://github.com/containers/ramalama/blob/main/docs/ramalama-serve.1.md#--ctx-size--c). ::: - The native `DeepSeek-r1` model doesn't support tool calling, however, we have a [custom model](https://ollama.com/michaelneale/deepseek-r1-goose) you can use with Goose. + The native `DeepSeek-r1` model doesn't support tool calling, however, we have a [custom model](https://ollama.com/michaelneale/deepseek-r1-goose) you can use with goose. :::warning Note that this is a 70B model size and requires a powerful device to run smoothly. @@ -470,7 +500,7 @@ Here are some local providers we support: ollama run michaelneale/deepseek-r1-goose ``` - 3. In a separate terminal window, configure with Goose: + 3. In a separate terminal window, configure with goose: ```sh goose configure @@ -555,7 +585,7 @@ Here are some local providers we support: ollama run qwen2.5 ``` - 3. In a separate terminal window, configure with Goose: + 3. 
In a separate terminal window, configure with goose: ```sh goose configure @@ -638,7 +668,7 @@ Here are some local providers we support: ``` :::tip Context Length - If you notice that Goose is having trouble using extensions or is ignoring [.goosehints](/docs/guides/using-goosehints), it is likely that the model's default context length of 4096 tokens is too low. Set the `OLLAMA_CONTEXT_LENGTH` environment variable to a [higher value](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size). + If you notice that goose is having trouble using extensions or is ignoring [.goosehints](/docs/guides/using-goosehints), it is likely that the model's default context length of 4096 tokens is too low. Set the `OLLAMA_CONTEXT_LENGTH` environment variable to a [higher value](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size). ::: @@ -655,7 +685,7 @@ Here are some local providers we support: docker model pull hf.co/unsloth/gemma-3n-e4b-it-gguf:q6_k ``` - 4. Configure Goose to use Docker Model Runner, using the OpenAI API compatible endpoint: + 4. Configure goose to use Docker Model Runner, using the OpenAI API compatible endpoint: ```sh goose configure @@ -718,7 +748,7 @@ Here are some local providers we support: Docker model runner uses `/engines/llama.cpp/v1/chat/completions` for the base path. - 9. Finally configure the model available in Docker Model Runner to be used by Goose: `hf.co/unsloth/gemma-3n-e4b-it-gguf:q6_k` + 9. Finally configure the model available in Docker Model Runner to be used by goose: `hf.co/unsloth/gemma-3n-e4b-it-gguf:q6_k` ``` │ @@ -735,7 +765,7 @@ Here are some local providers we support: ## Azure OpenAI Credential Chain -Goose supports two authentication methods for Azure OpenAI: +goose supports two authentication methods for Azure OpenAI: 1. **API Key Authentication** - Uses the `AZURE_OPENAI_API_KEY` for direct authentication 2. 
**Azure Credential Chain** - Uses Azure CLI credentials automatically without requiring an API key @@ -757,7 +787,7 @@ Beyond single-model setups, goose supports [multi-model configurations](/docs/gu --- -If you have any questions or need help with a specific provider, feel free to reach out to us on [Discord](https://discord.gg/block-opensource) or on the [Goose repo](https://github.com/block/goose). +If you have any questions or need help with a specific provider, feel free to reach out to us on [Discord](https://discord.gg/block-opensource) or on the [goose repo](https://github.com/block/goose). [providers]: /docs/getting-started/providers diff --git a/documentation/docs/quickstart.md b/documentation/docs/quickstart.md index 2ba1478d9789..0690e72d1ade 100644 --- a/documentation/docs/quickstart.md +++ b/documentation/docs/quickstart.md @@ -98,7 +98,7 @@ Let's begin 🚀 Learn about prerequisites in the [installation guide](/docs/getting-started/installation). :::info PATH Warning And Keyring - If you see a PATH warning after installation, you'll need to add Goose to your PATH before running `goose configure`. See the [Windows CLI installation instructions](/docs/getting-started/installation) for detailed steps. + If you see a PATH warning after installation, you'll need to add goose to your PATH before running `goose configure`. See the [Windows CLI installation instructions](/docs/getting-started/installation) for detailed steps. If prompted during configuration, choose to not store to keyring. If you encounter keyring errors, see the [Windows setup instructions](/docs/getting-started/installation#set-llm-provider) for more information. ::: @@ -110,55 +110,121 @@ Let's begin 🚀 ## Configure Provider -Goose works with [supported LLM providers](/docs/getting-started/providers) that give Goose the AI intelligence it needs to understand your requests. On first use, you'll be prompted to configure your preferred provider. 
+goose works with [supported LLM providers](/docs/getting-started/providers) that give goose the AI intelligence it needs to understand your requests. On first use, you'll be prompted to configure your preferred provider. - - - On the welcome screen, choose `Automatic setup with Tetrate Agent Router`. - - Goose will open a browser for you to authenticate. + + On the welcome screen, you have three options: + - **Automatic setup with [Tetrate Agent Router](https://tetrate.io/products/tetrate-agent-router-service)** + - **Automatic Setup with [OpenRouter](https://openrouter.ai/)** + - **Other Providers** + + For this quickstart, choose `Automatic setup with Tetrate Agent Router`. Tetrate provides access to multiple AI models with built-in rate limiting and automatic failover. For more information about OpenRouter or other providers, see [Configure LLM Provider](/docs/getting-started/providers). + + goose will open a browser for you to authenticate with Tetrate, or create a new account if you don't have one already. When you return to the goose desktop app, you're ready to begin your first session. - :::info Free Credits Offer - You'll receive $10 in free credits the first time you automatically authenticate with Tetrate through Goose. This offer is available to both new and existing Tetrate users and is valid through October 2, 2025. - ::: - - Tetrate provides access to multiple AI models with built-in rate limiting and automatic failover. If you prefer a different provider, choose automatic setup with OpenRouter or manually configure a provider. + :::info Free Credits Offer + You'll receive $10 in free credits the first time you automatically authenticate with Tetrate through goose. This offer is available to both new and existing Tetrate users. + ::: - - - On the welcome screen, choose `Tetrate Agent Router Service Login`. Use the up and down arrow keys to navigate the options, then press `Enter` to select. - - Goose will open a browser for you to authenticate. 
- + + 1. In your terminal, run the following command: + + ```sh + goose configure + ``` + + 2. Select `Configure Providers` from the menu and press Enter. + + ``` + ┌ goose-configure + │ + ◆ What would you like to configure? + │ ● Configure Providers (Change provider or update credentials) + │ ○ Add Extension + │ ○ Toggle Extensions + │ ○ Remove Extension + │ ○ Goose Settings + └ + ``` + 3. Choose a model provider. For this quickstart, select `Tetrate Agent Router Service` and press Enter. Tetrate provides access to multiple AI models with built-in rate limiting and automatic failover. For information about other providers, see [Configure LLM Provider](/docs/getting-started/providers). + + ``` + ┌ goose-configure + │ + ◇ What would you like to configure? + │ Configure Providers + │ + ◆ Which model provider should we use? + │ ○ Anthropic + │ ○ Azure OpenAI + │ ○ Amazon Bedrock + │ ○ Claude Code + │ ○ Databricks + │ ○ Gemini CLI + | ● Tetrate Agent Router Service (Enterprise router for AI models) + │ ○ ... + └ + ``` :::info Free Credits Offer - You'll receive $10 in free credits the first time you automatically authenticate with Tetrate through Goose. This offer is available to both new and existing Tetrate users and is valid through October 2, 2025. + You'll receive $10 in free credits the first time you automatically authenticate with Tetrate through goose. This offer is available to both new and existing Tetrate users. ::: - Tetrate provides access to multiple AI models with built-in rate limiting and automatic failover. If you prefer a different provider, choose automatic setup with OpenRouter or manually configure a provider. - + 4. Enter your API key (and any other configuration details) when prompted. + + ``` + ┌ goose-configure + │ + ◇ What would you like to configure? + │ Configure Providers + │ + ◇ Which model provider should we use? 
+ │ Tetrate Agent Router Service + │ + ◆ Provider Tetrate Agent Router Service requires TETRATE_API_KEY, please enter a value + │ ▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪▪ + └ + ``` + 5. Select or search for the model you want to use. + ``` + │ + ◇ Model fetch complete + │ + ◆ Select a model: + │ ○ Search all models... + │ ○ gemini-2.5-pro + │ ○ gemini-2.0-flash + | ○ gemini-2.0-flash-lite + │ ● gpt-5 (Recommended) + | ○ gpt-5-mini + | ○ gpt-5-nano + | ○ gpt-4.1 + │ + ◓ Checking your configuration... + └ Configuration saved successfully + ``` ## Start Session -Sessions are single, continuous conversations between you and Goose. Let's start one. +Sessions are single, continuous conversations between you and goose. Let's start one. - + After choosing an LLM provider, click the `Home` button in the sidebar. - Type your questions, tasks, or instructions directly into the input field, and Goose will immediately get to work. + Type your questions, tasks, or instructions directly into the input field, and goose will immediately get to work. - + 1. Make an empty directory (e.g. `goose-demo`) and navigate to that directory from the terminal. 2. To start a new session, run: ```sh goose session ``` - :::tip Goose Web - CLI users can also start a session in [Goose Web](/docs/guides/goose-cli-commands#web), a web-based chat interface: + :::tip goose Web + CLI users can also start a session in [goose Web](/docs/guides/goose-cli-commands#web), a web-based chat interface: ```sh goose web --open ``` @@ -169,31 +235,31 @@ Sessions are single, continuous conversations between you and Goose. Let's start ## Write Prompt -From the prompt, you can interact with Goose by typing your instructions exactly as you would speak to a developer. +From the prompt, you can interact with goose by typing your instructions exactly as you would speak to a developer. -Let's ask Goose to make a tic-tac-toe game! +Let's ask goose to make a tic-tac-toe game! 
 ```
 create an interactive browser-based tic-tac-toe game in javascript where a player competes against a bot
 ```

-Goose will create a plan and then get right to work on implementing it. Once done, your directory should contain a JavaScript file as well as an HTML page for playing.
+goose will create a plan and then get right to work on implementing it. Once done, your directory should contain a JavaScript file as well as an HTML page for playing.

 ## Enable an Extension

-While you're able to manually navigate to your working directory and open the HTML file in a browser, wouldn't it be better if Goose did that for you? Let's give Goose the ability to open a web browser by enabling the [`Computer Controller` extension](/docs/mcp/computer-controller-mcp).
+While you're able to manually navigate to your working directory and open the HTML file in a browser, wouldn't it be better if goose did that for you? Let's give goose the ability to open a web browser by enabling the [`Computer Controller` extension](/docs/mcp/computer-controller-mcp).

-
+  1. Click the button in the top-left to open the sidebar.
   2. Click `Extensions` in the sidebar menu.
   3. Toggle the `Computer Controller` extension to enable it. This extension enables webscraping, file caching, and automations.
   4. Return to your session to continue.
-  5. Now that Goose has browser capabilities, let's ask it to launch your game in a browser:
+  5. Now that goose has browser capabilities, let's ask it to launch your game in a browser:

-
+  1. End the current session by entering `Ctrl+C` so that you can return to the terminal's command prompt.
   2. Run the configuration command
      ```sh
@@ -217,11 +283,11 @@ While you're able to manually navigate to your working directory and open the HT
   │
   └  Enabled computercontroller extension
   ```
-  4. Now that Goose has browser capabilities, let's resume your last session:
+  4. Now that goose has browser capabilities, let's resume your last session:
   ```sh
   goose session -r
   ```
-  5. Ask Goose to launch your game in a browser:
+  5. Ask goose to launch your game in a browser:
@@ -233,10 +299,10 @@

 Go ahead and play your game, I know you want to 😂 ... good luck!

 ## Next Steps

-Congrats, you've successfully used Goose to develop a web app! 🎉
+Congrats, you've successfully used goose to develop a web app! 🎉

 Here are some ideas for next steps:
-* Continue your session with Goose and improve your game (styling, functionality, etc).
-* Browse other available [extensions](/extensions) and install more to enhance Goose's functionality even further.
-* Provide Goose with a [set of hints](/docs/guides/using-goosehints) to use within your sessions.
+* Continue your session with goose and improve your game (styling, functionality, etc).
+* Browse other available [extensions](/extensions) and install more to enhance goose's functionality even further.
+* Provide goose with a [set of hints](/docs/guides/using-goosehints) to use within your sessions.

diff --git a/documentation/src/components/OnboardingProviderSetup.js b/documentation/src/components/OnboardingProviderSetup.js
index 3922db2ead11..c515d089da95 100644
--- a/documentation/src/components/OnboardingProviderSetup.js
+++ b/documentation/src/components/OnboardingProviderSetup.js
@@ -5,7 +5,7 @@ export const OnboardingProviderSetup = () => {
     <>
-      • Tetrate Agent Router - One-click OAuth authentication provides instant access to multiple AI models, starting credits, and built-in rate limiting.
+      • Tetrate Agent Router - One-click OAuth authentication provides instant access to multiple AI models, starting credits, and built-in rate limiting. See the goose quickstart guide for a walkthrough of this setup.
-        You'll receive $10 in free credits the first time you automatically authenticate with Tetrate through Goose. This offer is available to both new and existing Tetrate users and is valid through October 2, 2025.
+        You'll receive $10 in free credits the first time you automatically authenticate with Tetrate through goose. This offer is available to both new and existing Tetrate users.
       • OpenRouter - One-click OAuth authentication provides instant access to multiple AI models with built-in rate limiting.
-      • Other Providers - Choose from ~20 supported providers including OpenAI, Anthropic, Google Gemini, and others through manual configuration. Be ready to provide your API key.
+      • Other Providers - Choose from ~20 supported providers including OpenAI, Anthropic, Google Gemini, and others through manual configuration. If you don't see your provider in the list, you can add a custom provider. Be ready to provide your API key, API host address, or other parameters depending on the provider.
 );
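
---

As a companion to the CLI walkthrough added in the quickstart diff above, here is a sketch of what goose persists after the configure flow prints `Configuration saved successfully`. The file path, key names, and the `tetrate` provider id are assumptions based on goose's defaults and are not part of this PR; the API key itself is typically stored separately rather than in this file.

```yaml
# Hypothetical contents of goose's config file after completing the quickstart.
# Path and key names are assumptions, not taken from this diff.
# ~/.config/goose/config.yaml
GOOSE_PROVIDER: tetrate   # assumed provider id for Tetrate Agent Router Service
GOOSE_MODEL: gpt-5        # the model selected in step 5
# TETRATE_API_KEY is kept out of this file (e.g. in the system keyring).
```

Re-running `goose configure` and choosing `Configure Providers` updates these values in place, which is how the quickstart switches models or providers later.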