Merge branch 'main' into feat-context
ogabrielluiz authored Nov 8, 2024
2 parents cb61de5 + b898d8a commit 06d79bf
Showing 63 changed files with 1,876 additions and 13,250 deletions.
6 changes: 6 additions & 0 deletions .github/workflows/codspeed.yml
@@ -2,9 +2,15 @@ name: Run benchmarks

```yaml
on:
  push:
    paths:
      - "src/backend/base/**"
      - "src/backend/tests/performance/**"
    branches:
      - "main" # or "master"
  pull_request:
    paths:
      - "src/backend/base/**"
      - "src/backend/tests/performance/**"
  workflow_dispatch:
```

jobs:
161 changes: 65 additions & 96 deletions docs/docs/Configuration/configuration-backend-only.md
@@ -1,154 +1,123 @@
---
title: Run Langflow in backend-only mode
sidebar_position: 4
slug: /configuration-backend-only
---

Langflow can run in `--backend-only` mode to expose a Langflow app as an API endpoint, without running the frontend UI.
This is also known as "headless" mode. Running Langflow without the frontend is useful for automation, testing, and situations where you just need to serve a flow as a workload without creating a new flow in the UI.

To run Langflow in backend-only mode, pass the `--backend-only` flag at startup.


```shell
python3 -m langflow run --backend-only
```

The terminal prints `Welcome to ⛓ Langflow`, and Langflow will now serve requests to its API without the frontend running.
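Before sending requests, you can check that the headless server is reachable. The following is a minimal sketch using only the Python standard library; it assumes the address used in this page's examples (`http://127.0.0.1:7861`) and that the server exposes a `/health` endpoint, so adjust both for your deployment.

```python
# Minimal reachability check for a headless Langflow server.
# Assumptions: base URL http://127.0.0.1:7861 (as in this page's
# examples) and a /health endpoint; adjust for your setup.
import urllib.request

def backend_is_up(base_url: str = "http://127.0.0.1:7861") -> bool:
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, timeout, or DNS failure: server not reachable.
        return False

print(backend_is_up())
```

If this prints `False`, confirm Langflow is running and that the host and port match your configuration.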

## Set up a basic prompting flow in backend-only mode

This example shows you how to set up a [Basic Prompting flow](/starter-projects-basic-prompting) as an endpoint in backend-only mode.
However, you can use these same instructions as guidelines for using any type of flow in backend-only mode.

### Prerequisites

- [Langflow is installed](/getting-started-installation)
- [You have an OpenAI API key](https://platform.openai.com/)
- [You have a Langflow Basic Prompting flow](/starter-projects-basic-prompting)

### Get your flow's ID

This guide assumes you have created a [Basic Prompting flow](/starter-projects-basic-prompting) or have another working flow available.

1. In the Langflow UI, click **API**.
2. Click **curl** > **Copy code** to copy the curl command.
This command will POST input to your flow's endpoint.
It will look something like this:

```text
curl -X POST \
  "http://127.0.0.1:7861/api/v1/run/fff8dcaa-f0f6-4136-9df0-b7cb38de42e0?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{"input_value": "message",
  "output_type": "chat",
  "input_type": "chat",
  "tweaks": {
    "ChatInput-8a86T": {},
    "Prompt-pKfl9": {},
    "ChatOutput-WcGpD": {},
    "OpenAIModel-5UyvQ": {}
  }}'
```

The flow ID in this example is `fff8dcaa-f0f6-4136-9df0-b7cb38de42e0`, a UUID generated by Langflow and used in the endpoint URL.
See [Project & General Settings](/settings-project-general-settings) to change the endpoint.

3. To stop Langflow, press **Ctrl+C**.
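The `tweaks` object in the request payload lets you override component parameters per request, without editing the flow. A sketch of building such a payload: the component ID is taken from the curl command above, while the `model_name` override is only an illustration, not a required setting.

```python
import json

# Build a run payload that overrides one component parameter via "tweaks".
# "OpenAIModel-5UyvQ" is the component ID from the curl example above;
# the model_name value is an illustrative override, not a required setting.
payload = {
    "input_value": "message",
    "output_type": "chat",
    "input_type": "chat",
    "tweaks": {
        "OpenAIModel-5UyvQ": {"model_name": "gpt-4"},
    },
}
print(json.dumps(payload, indent=2))
```

Sending this payload instead of the default one runs the same flow with the overridden parameter applied for that request only.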

### Start Langflow in backend-only mode

1. Start Langflow in backend-only mode.

```shell
python3 -m langflow run --backend-only
```

The terminal prints `Welcome to ⛓ Langflow`.
Langflow is now serving requests to its API.

2. Run the curl code you copied from the UI.
You should get a result like this:

```json
{"session_id":"fff8dcaa-f0f6-4136-9df0-b7cb38de42e0:bf81d898868ac87e1b4edbd96c131c5dee801ea2971122cc91352d144a45b880","outputs":[{"inputs":{"input_value":"hi, are you there?"},"outputs":[{"results":{"result":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?"},"artifacts":{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI"},"messages":[{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI","component_id":"ChatOutput-WcGpD"}],"component_display_name":"Chat Output","component_id":"ChatOutput-WcGpD","used_frozen_result":false}]}]}
```

This confirms Langflow is receiving your POST request, running the flow, and returning the result without running the frontend.

You can interact with this endpoint using the other options in the **API** menu, including the Python and JavaScript APIs.

### Query the Langflow endpoint with a Python script

Using the same flow ID, run a Python sample script to send a query and get a prettified JSON response back.


1. Create a Python file and name it `langflow_api_demo.py`.

```python
import requests
import json

def query_langflow(message):
    url = "http://127.0.0.1:7861/api/v1/run/fff8dcaa-f0f6-4136-9df0-b7cb38de42e0"
    headers = {"Content-Type": "application/json"}
    data = {"input_value": message}

    response = requests.post(url, headers=headers, json=data)
    return response.json()

user_input = input("Enter your message: ")
result = query_langflow(user_input)

print(json.dumps(result, indent=2))
```
2. Run the script.

```shell
python langflow_api_demo.py
```

3. Enter your message when prompted.
You will get a prettified JSON response containing the flow's reply to your message.
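The reply text itself is nested several levels deep in the response. Based on the response shape shown in the curl example earlier, here is a sketch for pulling out just the chat text; it assumes the same `outputs` structure, so adapt the key path if your flow returns a different shape.

```python
# Extract the chat reply from a Langflow run response.
# Assumes the nested "outputs" shape shown in the curl example above.
def extract_reply(result: dict) -> str:
    return result["outputs"][0]["outputs"][0]["results"]["result"]

# A trimmed-down sample response in that shape:
sample = {
    "outputs": [
        {"outputs": [{"results": {"result": "Arrr, ahoy matey!"}}]}
    ]
}
print(extract_reply(sample))  # Arrr, ahoy matey!
```

You could call `extract_reply(result)` in `langflow_api_demo.py` instead of dumping the full JSON.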

### Configure host and ports in backend-only mode

To change the host and port, pass the values as additional flags.

```shell
python -m langflow run --host 127.0.0.1 --port 7860 --backend-only
```




See the Langflow API documentation for more ways to interact with your headless Langflow server.
