Pinned
- Chinese-PreTrained-BERT (Public): We released BERT-wwm, a Chinese pre-trained model based on Whole Word Masking technology, along with models closely related to this technology.
- Chinese-PreTrained-GPT (Public, Jupyter Notebook): We released CP-GPT, a Chinese pre-trained model based on Generative Pre-Training technology, along with models closely related to this technology.