
Conversation

@iandouglas
Contributor

Since `.goosehints` and every memory stored in the Memory Extension are sent with every request to the LLM, I added notes on how to take lengthy notes/memories, write them into external files, and use shorter rules/memories to point to those files as needed. While this will likely cause an extra round of back-and-forth between Goose and the LLM, it should reduce context window usage by not sending long sets of rules/guidelines that are unrelated to a given request.
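
For example (the path and wording here are just illustrative, not taken from the docs), a short hint that delegates to a longer external file might look like:

```
# .goosehints — kept short so the system prompt stays small
When I ask for help with JavaScript, first read /path/to/javascript_guidelines.md
and follow the conventions described there.
```

The detailed guidelines live in the external file and are only read when a request actually needs them.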


@blackgirlbytes
Contributor

blackgirlbytes commented Aug 6, 2025

Since you're already in this file... would you be okay with removing

You can name your context files differently -- e.g. AGENTS.md -- and Goose can still pick them up. Configure the CONTEXT_FILE_NAMES setting!

This wasn't something you added, but Diane had a ticket on it. You're already here... we might as well remove it.

@blackgirlbytes
Contributor

blackgirlbytes commented Aug 6, 2025

What if, after this line:

A good time to consider adding a `.goosehints` file is when you find yourself repeating prompts, or providing the same kind of instructions multiple times. It's also a great way to provide a lot of context which might be better suited in a file.

We add this:

Goosehints are loaded at the start of your session and become part of the system prompt sent with every request. This means the content of `.goosehints` contributes to token usage, so keeping it concise can save both cost and processing time.

@blackgirlbytes
Contributor

Okay, I have one more note. What do you think of this? I think the pro tip feels a little tacked on, like an after-the-fact thing. So in the area where it says:

Why Use Memory?
With the Memory extension, you're not just storing static notes; you're teaching Goose how to assist you better. Imagine telling Goose:

learn everything about MCP servers and save it to memory.

Later, you can ask:

utilizing our MCP server knowledge help me build an MCP server.

Goose will recall everything you've saved, as long as you instruct it to remember. This makes it easier to get consistent results when working with Goose.

What if we add:

Memories are retrieved dynamically based on your request and sent to the LLM with every prompt. For large or detailed instructions, you can store them in files and instruct Goose to reference those files using a prompt like the one below:

Remember that if I ask for help writing JavaScript, refer to "/path/to/javascript_nodes.txt" and follow the instructions in that file.

@iandouglas iandouglas merged commit bb26521 into main Aug 6, 2025
12 checks passed
@iandouglas iandouglas deleted the iand/hints-memories-reduction branch August 6, 2025 19:11
katzdave added a commit that referenced this pull request Aug 6, 2025
* 'main' of github.com:block/goose:
  Upgrade to MCP-UI ~5.6.2 and handle internalized auto iframe resizing (#3889)
  docs: recipe updates (#3844)
  added notes about reducing context window by referencing external files (#3895)
michaelneale added a commit that referenced this pull request Aug 7, 2025
* main:
  Upgrade to MCP-UI ~5.6.2 and handle internalized auto iframe resizing (#3889)
  docs: recipe updates (#3844)
  added notes about reducing context window by referencing external files (#3895)
  Make the window title reflect what we are doing (#3883)
  additional metrics + Ui implementation (#3871)
  feat: Add session description editing functionality (#3819)
  Update filename in contributing docs (#3866)
  Fix voice dictation provider selection bug (#3862)
  doc: Update supported container runtimes (#3874)
  feat: add OAuth provider abstraction for CLI configuration (#3157)
  Don't ignore lockfiles on linux/windows builds (#3859)
  Use RMCP for StreamableHTTP OAuth support (#3845)
  Try to keep key order for Databricks (#3876)
  Fix OpenAI Provider with GitHub Models (#3875)
  Cmd click open finder (#3807)
  fix: recipe parameter form max height and not scrolling (#3879)