
Commit 61f2a14

William Bakst committed
Merge branch 'main' into release/v1.10
2 parents: 3f3cbe3 + 5a5473c

12 files changed (+60, -0 lines)
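Every file in this diff gains the same five-line front-matter block, which appears to be the Material for MkDocs per-page search boost setting (boost: 3 for calls.md and prompts.md, boost: 2 for the rest). A sketch of the pattern as it lands at the top of each page:

---
search:
  boost: 2
---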

docs/learn/agents.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Agents

 > __Definition__: a person who acts on behalf of another person or group

docs/learn/async.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Async

 Asynchronous programming is a crucial concept when building applications with LLMs (Large Language Models) using Mirascope. This feature allows for efficient handling of I/O-bound operations (e.g., API calls), improving application responsiveness and scalability. Mirascope utilizes the [asyncio](https://docs.python.org/3/library/asyncio.html) library to implement asynchronous processing.

docs/learn/calls.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 3
+---
+
 # Calls

 !!! mira ""

docs/learn/chaining.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Chaining

 !!! mira ""

docs/learn/evals.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Evals: Evaluating LLM Outputs

 !!! mira ""

docs/learn/json_mode.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # JSON Mode

 !!! mira ""

docs/learn/output_parsers.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Output Parsers

 !!! mira ""

docs/learn/prompts.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 3
+---
+
 # Prompts

 When working with Large Language Model (LLM) APIs, the "prompt" is generally a list of messages where each message has a particular role. These prompts are the foundation of effectively working with LLMs, so Mirascope provides powerful tools to help you create, manage, and optimize your prompts for various LLM interactions.

docs/learn/response_models.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Response Models

 !!! mira ""

docs/learn/retries.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Retries

 Making an API call to a provider can fail due to various reasons, such as rate limits, internal server errors, validation errors, and more. This makes retrying calls extremely important when building robust systems.

docs/learn/streams.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Streams

 !!! mira ""

docs/learn/tools.md (+5)

@@ -1,3 +1,8 @@
+---
+search:
+  boost: 2
+---
+
 # Tools

 {% set tool_methods = [["base_tool", "BaseTool"], ["function", "Function"]] %}
