What is max_context_length? #67

Open
volagold opened this issue Jul 28, 2023 · 3 comments
Comments

@volagold

As the title asks: the default value is 512. Does this mean that when generating the next token, the model only looks at up to 512 tokens of preceding context?

    def chat(
        self,
        history: List[str],
        *,
        max_length: int = 2048,
        max_context_length: int = 512,
        do_sample: bool = True,
        top_k: int = 0,
        top_p: float = 0.7,
        temperature: float = 0.95,
        num_threads: int = 0,
    ) -> str:
        gen_config = _C.GenerationConfig(
            max_length=max_length,
            max_context_length=max_context_length,
            do_sample=do_sample,
            top_k=top_k,
            top_p=top_p,
            temperature=temperature,
            num_threads=num_threads,
        )
        # ... (rest of the method omitted in this excerpt)
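
For context, a minimal usage sketch of this API (illustrative only; it assumes the chatglm_cpp Python package and a placeholder model path):

    import chatglm_cpp

    # Load a converted GGML model; the path below is a placeholder.
    pipeline = chatglm_cpp.Pipeline("./chatglm-ggml.bin")

    # history is a flat list of conversation turns; here, a single user prompt.
    reply = pipeline.chat(
        ["你好"],
        max_length=2048,
        max_context_length=512,  # the parameter this issue asks about
    )
    print(reply)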
@li-plus (Owner)

li-plus commented Aug 7, 2023

max_context_length is the maximum length of the input (prompt). On CPU, a very long input makes the first-token latency too high; if you are running on GPU, you can increase it. Separately, max_length is the maximum combined length of input + output.
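
Put concretely, a hypothetical sketch of the semantics described above (derived from this comment, not taken from the library's source):

    # Hypothetical helper illustrating how the two limits interact.
    def generation_budget(prompt_tokens: int,
                          max_length: int = 2048,
                          max_context_length: int = 512) -> int:
        # The input (prompt) is capped at max_context_length tokens.
        context = min(prompt_tokens, max_context_length)
        # max_length bounds input + output, so the remainder is the output budget.
        return max_length - context

    print(generation_budget(100))    # 1948 tokens left for output
    print(generation_budget(10000))  # 1536 == 2048 - 512, prompt capped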

@volagold (Author)

volagold commented Aug 7, 2023

Got it, thank you for the explanation.

@compass-star

> max_context_length is the maximum length of the input (prompt). On CPU, a very long input makes the first-token latency too high; if you are running on GPU, you can increase it. Separately, max_length is the maximum combined length of input + output.

Is it correct to understand that max_length - max_context_length = the maximum output length, i.e. max_new_tokens?
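
A quick numeric check of that reading, taking the definitions in the comment above as given (illustrative; the thread does not confirm this against the source code):

    max_length, max_context_length = 2048, 512

    # When the prompt fills the context window, the output budget is exactly
    # max_length - max_context_length, matching the proposed max_new_tokens.
    print(max_length - min(512, max_context_length))   # 1536

    # A shorter prompt leaves a larger budget, so the equality would be a
    # lower bound on the output budget rather than a fixed output length.
    print(max_length - min(100, max_context_length))   # 1948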
