
Fix styles in fast_tokenizer #5217

Merged 1 commit into PaddlePaddle:develop from fix-styles-ft on Mar 14, 2023

Conversation

@sijunhe (Collaborator) commented Mar 14, 2023

PR types

PR changes

Description

@sijunhe sijunhe requested a review from joey12300 March 14, 2023 05:59
@paddle-bot bot commented Mar 14, 2023

Thanks for your contribution!

@sijunhe sijunhe changed the title Fix styles in fast_tokenization Fix styles in fast_tokenizer Mar 14, 2023
@codecov bot commented Mar 14, 2023

Codecov Report

Merging #5217 (2281a08) into develop (2ee028b) will increase coverage by 0.31%.
The diff coverage is n/a.

@@             Coverage Diff             @@
##           develop    #5217      +/-   ##
===========================================
+ Coverage    51.60%   51.91%   +0.31%     
===========================================
  Files          468      469       +1     
  Lines        66642    66674      +32     
===========================================
+ Hits         34389    34616     +227     
+ Misses       32253    32058     -195     

see 5 files with indirect coverage changes


@joey12300 (Contributor) left a comment

LGTM

@sijunhe sijunhe merged commit b4b2234 into PaddlePaddle:develop Mar 14, 2023
@sijunhe sijunhe deleted the fix-styles-ft branch March 14, 2023 07:01
@nimo2021 commented Mar 23, 2024

[screenshot: "err"]

Hi, there is a misnamed keyword argument here: with_added_tokens. Looking at the C wrapper, it should be with_added_vocabulary.

On the current fast-tokenizer-python-1.0.2.post1, the wrapper still passes:

with_added_tokens=with_added_tokens
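The mismatch described above is a common Python-binding pitfall: the wrapper forwards a keyword argument under one name while the underlying C-level function declares another, so every call through the wrapper fails with a TypeError. A minimal, self-contained sketch of the failure mode and the fix (the function names `c_get_vocab`, `get_vocab_buggy`, and `get_vocab_fixed` are hypothetical stand-ins, not the actual fast_tokenizer API):

```python
# Hypothetical stand-in for the C-level binding: it only accepts
# `with_added_vocabulary`, mirroring the naming nimo2021 found in the C wrapper.
def c_get_vocab(with_added_vocabulary=True):
    return {"[UNK]": 0} if with_added_vocabulary else {}

# Buggy wrapper: forwards the value under the wrong keyword name, so any
# call raises TypeError ("unexpected keyword argument 'with_added_tokens'").
def get_vocab_buggy(with_added_tokens=True):
    return c_get_vocab(with_added_tokens=with_added_tokens)

# Fixed wrapper: keeps the public parameter name, but forwards it under the
# keyword the binding actually declares.
def get_vocab_fixed(with_added_tokens=True):
    return c_get_vocab(with_added_vocabulary=with_added_tokens)

try:
    get_vocab_buggy()
except TypeError as e:
    print("buggy wrapper fails:", e)

print("fixed wrapper returns:", get_vocab_fixed())
```

The fix is a one-line rename at the call site; the wrapper's public signature can stay unchanged so existing callers are unaffected.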
