Official code for "DaisyRec 2.0: Benchmarking Recommendation for Rigorous Evaluation" (TPAMI2022) and "Are We Evaluating Rigorously? Benchmarking Recommendation for Reproducible Evaluation and Fair Comparison" (RecSys2020)

AmazingDD/daisyRec

Overview

daisyRec is a Python toolkit developed for benchmarking top-N recommendation tasks. The name DAISY stands for multi-Dimension fAir comparIson for recommender SYstem.

The figure below shows the overall framework of DaisyRec-v2.0.

This repository is used for publishing. If you are interested in the details of our experiment ranking results, please refer to this repo file.

We are grateful to the following repositories for helping us improve the code's efficiency:

How to Run

Make sure you have a CUDA environment for acceleration, since the deep-learning models can make use of it.

1. Install from pip

pip install daisyRec

2. Clone from github

git clone https://github.com/AmazingDD/daisyRec.git && cd daisyRec
  • Example code is listed in run_examples; refer to it to find out how to use daisy. You can also run these examples by moving them into daisyRec/.

  • The GUI Command Generator for test.py and tune.py, which helps you quickly write arguments and run the fair-comparison experiments, is now available here.

    The generated command will look like this:

    python tune.py --param1=20 --param2=30 ....
    python test.py --param1=20 --param2=30 ....
    

    We highly recommend starting with our GUI to generate the commands!
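Under the hood, tune.py and test.py consume the generated flags as ordinary command-line arguments. A minimal sketch of how such a script might parse them with Python's standard argparse module (the flag names --param1 and --param2 are the placeholders from the example command above, not daisy's real arguments):

```python
import argparse

def parse_args(argv=None):
    """Parse experiment hyper-parameters from the command line (sketch)."""
    parser = argparse.ArgumentParser(description="DaisyRec-style experiment runner (illustrative)")
    # Placeholder flags mirroring the generated command; the real scripts expose many more.
    parser.add_argument("--param1", type=int, default=10, help="example hyper-parameter")
    parser.add_argument("--param2", type=int, default=20, help="example hyper-parameter")
    return parser.parse_args(argv)

# Parsing the flags from the generated command shown above:
args = parse_args(["--param1=20", "--param2=30"])
print(args.param1, args.param2)  # 20 30
```

Passing `argv` explicitly (instead of reading `sys.argv`) makes the parser easy to exercise from other code or tests.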

Documentation

The documentation of DaisyRec is available here; it provides detailed explanations of all arguments.

Implemented Algorithms

Models in daisyRec only take <user, item, rating> triples into account, so FM-related models are specialized accordingly. The algorithms implemented in daisyRec are listed below; more baselines will be added later.

| Model | Publication |
|-------|-------------|
| MostPop | A re-visit of the popularity baseline in recommender systems |
| ItemKNN | Item-based top-N recommendation algorithms |
| EASE | Embarrassingly Shallow Autoencoders for Sparse Data |
| PureSVD | Top-N recommender system via matrix completion |
| SLIM | SLIM: Sparse Linear Methods for Top-N Recommender Systems |
| MF | Matrix factorization techniques for recommender systems |
| FM | Factorization Machines |
| NeuMF | Neural Collaborative Filtering |
| NFM | Neural Factorization Machines for Sparse Predictive Analytics |
| NGCF | Neural Graph Collaborative Filtering |
| Multi-VAE | Variational Autoencoders for Collaborative Filtering |
| Item2Vec | Item2vec: neural item embedding for collaborative filtering |
| LightGCN | LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation |
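As a concrete illustration of a model that operates purely on <user, item, rating> triples, here is a minimal, library-independent sketch of the MostPop baseline (recommend the globally most popular items the user has not yet interacted with). This is an illustrative reimplementation, not daisy's actual code:

```python
from collections import Counter

def most_pop_recommend(triples, user, topn=10):
    """Rank items unseen by `user` by their global interaction count (MostPop)."""
    popularity = Counter(item for _, item, _ in triples)          # count every interaction
    seen = {item for u, item, _ in triples if u == user}          # items this user already rated
    ranked = [item for item, _ in popularity.most_common() if item not in seen]
    return ranked[:topn]

# Toy interaction log of (user, item, rating) triples.
triples = [
    (0, "a", 5.0), (0, "b", 3.0),
    (1, "a", 4.0), (1, "c", 2.0),
    (2, "a", 5.0), (2, "b", 1.0),
]
print(most_pop_recommend(triples, user=1))  # ['b'] -- 'a' and 'c' are already seen
```

Note that ratings are ignored entirely: MostPop ranks by interaction frequency alone, which is exactly why it serves as the simplest sanity-check baseline in top-N benchmarks.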

Datasets

You can download the experiment data and put it into the data folder. All datasets are available via the links below:
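Most of these datasets ship as delimited rating logs. A hedged sketch of turning such a file into the <user, item, rating> triples described above, using only the standard library (the column order and separator are assumptions and vary by dataset; check each dataset's own description):

```python
import csv
import io

def load_triples(fileobj, delimiter=","):
    """Read (user, item, rating) triples from a delimited ratings log.

    Assumes the first three columns are user, item, rating; extra columns
    (e.g. timestamps) are ignored.
    """
    reader = csv.reader(fileobj, delimiter=delimiter)
    return [(row[0], row[1], float(row[2])) for row in reader if row]

# Tiny in-memory example standing in for a downloaded ratings file.
sample = io.StringIO("1,101,5.0\n1,102,3.0\n2,101,4.0\n")
triples = load_triples(sample)
print(triples[0])  # ('1', '101', 5.0)
```

For large logs you would typically stream rows instead of materializing the whole list, but the triple shape stays the same.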

Cite

Please cite both of the following papers if you use DaisyRec in a research paper in any way (e.g., using the code or ranking results):

@inproceedings{sun2020are,
  title={Are We Evaluating Rigorously? Benchmarking Recommendation for Reproducible Evaluation and Fair Comparison},
  author={Sun, Zhu and Yu, Di and Fang, Hui and Yang, Jie and Qu, Xinghua and Zhang, Jie and Geng, Cong},
  booktitle={Proceedings of the 14th ACM Conference on Recommender Systems},
  year={2020}
}

@article{sun2022daisyrec,
  title={DaisyRec 2.0: Benchmarking Recommendation for Rigorous Evaluation},
  author={Sun, Zhu and Fang, Hui and Yang, Jie and Qu, Xinghua and Liu, Hongyang and Yu, Di and Ong, Yew-Soon and Zhang, Jie},
  journal={arXiv preprint arXiv:2206.10848},
  year={2022}
}
