Then, all you need to do is continue training, following the same pretraining commands described in the BERT GitHub repo, except using the pretrained model as the initial checkpoint.
Put the checkpoint files of the fine-tuned model under model/fine-tuned/. Then, set "fine-tuned" in faspell_configs.json to the path of the fine-tuned model.
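For illustration, the relevant entry in faspell_configs.json might look like the sketch below. The surrounding key layout and the checkpoint name (`model.ckpt-10000`) are assumptions for this example; check the actual structure in your copy of the config file:

```json
{
  "general_configs": {
    "lm": {
      "fine-tuned": "model/fine-tuned/model.ckpt-10000"
    }
  }
}
```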
See "Pre-training with BERT" in the README of Google's BERT repo; adjust the parameters to your own needs and you will get model.ckpt and the related files. Then just put them under model/fine-tuned.
```
$ cd bert_modified
$ python create_data.py -f /path/to/training/data/file
$ python create_tf_record.py --input_file correct.txt --wrong_input_file wrong.txt --output_file tf_examples.tfrecord --vocab_file ../model/pre-trained/vocab.txt
```
After the steps above, I obtained the fine-tuning examples in tf_examples.tfrecord. I would like to know what these fine-tuning examples are for and how to use them.
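As noted earlier in the thread, tf_examples.tfrecord is meant to be fed to run_pretraining.py from Google's BERT repo, with the pre-trained model as the initial checkpoint. A hedged sketch of the command, following the "Pre-training with BERT" section of the BERT README (the output directory, checkpoint path, and hyperparameter values here are placeholders, not values prescribed by FASpell):

```shell
# Run from inside the BERT repo; paths below are illustrative placeholders.
python run_pretraining.py \
  --input_file=tf_examples.tfrecord \
  --output_dir=../model/fine-tuned \
  --do_train=True \
  --do_eval=True \
  --bert_config_file=../model/pre-trained/bert_config.json \
  --init_checkpoint=../model/pre-trained/bert_model.ckpt \
  --train_batch_size=32 \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --num_train_steps=10000 \
  --num_warmup_steps=10 \
  --learning_rate=2e-5
```

When training finishes, the checkpoint files written to the output directory are what goes under model/fine-tuned/.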