Very confused about the environment variable configuration for AI features. #46

@qiurufeng

Description

I'm especially confused by this statement: "Connect the provider name and the model name with a ">", such as openai:gpt-4o-mini, openrouter:openai/gpt-4o-mini"
The text says to use ">", but both examples use ":".
I've actually tried both formats, but neither seems to enable the LLM-related functionality.
My configuration file is:

# AI feature
ANYCRAWL_AI_CONFIG_PATH=
# if set ANYCRAWL_AI_CONFIG_PATH, the coming ai env will be disabled.
# AI for extracting
# format: openai>gpt-40
# Connect the provider name and the model name with a ">", such as openai:gpt-4o-mini, openrouter:openai/gpt-4o-mini
DEFAULT_LLM_MODEL=custom>glm-4.5-air
DEFAULT_EXTRACT_MODEL=custom>glm-4.5-air

# Support configuring many providers, and decide by model name.
# Provide your OpenAI API key here to enable AI features
#OPENAI_API_KEY=

# OpenRouter
# OPENROUTER_API_KEY=

# OpenAI-compatible API, the provider name is: "custom"
OPENAI_COMPATIBLE_BASE_URL=https://open.bigmodel.cn/api/paas/v4
OPENAI_COMPATIBLE_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxx

ANYCRAWL_EXTRACT_JSON_CREDITS=5
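
For context, here is a minimal sketch of how I would expect a provider-prefixed model string to be parsed. This is purely illustrative and not AnyCrawl's actual code; the separator character and the real parsing logic are exactly what I'm unsure about:

```ts
// Hypothetical helper (not AnyCrawl's actual implementation): split a model string
// such as "openai:gpt-4o-mini" or "custom>glm-4.5-air" into a provider and a model name.
// Assumes the FIRST occurrence of the separator divides provider from model, so
// "openrouter:openai/gpt-4o-mini" keeps "openai/gpt-4o-mini" intact as the model.
function parseModelString(
    value: string,
    separator: ":" | ">" = ":"
): { provider: string; model: string } {
    const index = value.indexOf(separator);
    if (index === -1) {
        throw new Error(`Expected "<provider>${separator}<model>", got "${value}"`);
    }
    return {
        provider: value.slice(0, index),
        model: value.slice(index + 1),
    };
}

// Examples of what I would expect:
// parseModelString("openai:gpt-4o-mini")       -> { provider: "openai", model: "gpt-4o-mini" }
// parseModelString("custom>glm-4.5-air", ">")  -> { provider: "custom", model: "glm-4.5-air" }
```

If someone can confirm which separator the current release actually expects (and whether the "custom" provider is the right one for an OpenAI-compatible endpoint), that would clear this up.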
