- Requires specifying the homepath.
- Requires TensorFlow, BERT, PyLucene, and scikit-learn.
python automatic_fact_verification_8395/prepare.py
Important assumption: the wiki titles are the entities themselves.
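Under this assumption, a page title can be mapped directly to an entity string. A minimal sketch of what that mapping might look like (the helper name and the underscore/disambiguation-suffix handling are assumptions, not part of this repo):

```python
import re

def title_to_entity(wiki_title: str) -> str:
    """Hypothetical helper: treat a wiki title as the entity it names.
    Wiki titles use underscores instead of spaces and may end with a
    disambiguation suffix such as "_(film)"."""
    entity = wiki_title.replace("_", " ")
    entity = re.sub(r"\s*\([^)]*\)\s*$", "", entity)  # drop "(film)" etc.
    return entity.strip()

print(title_to_entity("Nikolaj_Coster-Waldau"))
print(title_to_entity("Fox_Broadcasting_Company_(company)"))
```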
automatic_fact_verification_8395/main.ipynb
automatic_fact_verification_8395/common_scoring.ipynb
automatic_fact_verification_8395/pylucene/pylucene-title-content-based.py
Run the PyLucene-based search. The results are saved as xxx.pkl to intermediate_filepath.
python pylucene-title-content-based.py --help
python pylucene-title-content-based.py --firsttime=True --dataset_type='devset' --k=100
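The saved .pkl files can be read back with Python's pickle module. A sketch of the round trip, assuming (hypothetically) that a result maps each claim id to a ranked list of wiki titles; the directory and file names here are placeholders, not the repo's actual naming:

```python
import os
import pickle

# Hypothetical shape of a retrieval result: claim id -> ranked wiki titles.
results = {
    75397: ["Nikolaj_Coster-Waldau", "Fox_Broadcasting_Company"],
}

intermediate_filepath = "./intermediate_data"  # placeholder directory
os.makedirs(intermediate_filepath, exist_ok=True)
out_path = os.path.join(intermediate_filepath, "devset_top100.pkl")  # placeholder name

# Save the retrieval results.
with open(out_path, "wb") as f:
    pickle.dump(results, f)

# Load them back, e.g. for evaluation.
with open(out_path, "rb") as f:
    loaded = pickle.load(f)
print(loaded == results)
```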
Evaluate the PyLucene results:
python evaluate.py --filepath='/home/ubuntu/workspace/codelab/intermediate_data/'
Additionally, for the PyLucene results, we remove wiki titles that share no word with the claim.
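This overlap filter can be sketched as follows; the function name and the whitespace tokenization are assumptions for illustration, not the repo's exact implementation:

```python
def filter_titles(claim, titles):
    """Keep only wiki titles that share at least one word with the claim
    (hypothetical sketch of the overlap filter; tokenization details
    are assumptions)."""
    claim_words = {w.lower() for w in claim.split()}
    kept = []
    for title in titles:
        # Wiki titles use underscores instead of spaces.
        title_words = {w.lower() for w in title.replace("_", " ").split()}
        if claim_words & title_words:
            kept.append(title)
    return kept

claim = "Nikolaj Coster-Waldau worked with the Fox Broadcasting Company."
titles = ["Fox_Broadcasting_Company", "Telemundo", "Nikolaj_Coster-Waldau"]
print(filter_titles(claim, titles))
```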
The training and prediction scripts of BERT is in automatic_fact_verification_8395/bert/RUN_BERT.ipynb
A bag-of-words + Random Forest baseline is in automatic_fact_verification_8395/backup/bow-RandomForest.ipynb.