
exited with non-zero exit code #2

Closed
JinqunHuang opened this issue Sep 3, 2019 · 13 comments

Comments

@JinqunHuang

JinqunHuang commented Sep 3, 2019

Dear MicroPro Devs,

I installed and tested MicrobialPip as shown below, then got this error.
My dependency configuration file is attached:
Source.sh.txt

But when I rerun the centrifuge step by itself, it succeeds:

centrifuge -x test/database/abv -q -t -1 test/data/2_1.fq -2 test/data/2_2.fq -S 1_centrifuge/2_classification --report-file 1_centrifuge/2_report
Time loading forward index: 00:00:00
Time loading reference: 00:00:00
Multiseed full-index search: 00:00:05
Time searching: 00:00:05
report file 1_centrifuge/2_report
Number of iterations in EM algorithm: 0
Probability diff. (P - P_prev) in the last iteration: 0
Calculating abundance: 00:00:00
Overall time: 00:00:06

Any advice would be appreciated!

Best,
Jinqun Huang

$ cd MicroPro/MicrobialPip
$ chmod 755 utils/*
$ R --no-save --file=scripts/parameters.R --args P test/data _1 _2 .fq test 1000
$ snakemake -p -s Snakefile-P

Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-known-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-known-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-cross-assembly-P ignored
The flag 'directory' used in rule index is only valid for outputs, not inputs.
The flag 'directory' used in rule mapping is only valid for outputs, not inputs.
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-known-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-cross-assembly-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-map-to-ctg-P ignored
The flag 'directory' used in rule metbat2_binning is only valid for outputs, not inputs.
The flag 'directory' used in rule bin_count is only valid for outputs, not inputs.
The flag 'directory' used in rule unknown_abund is only valid for outputs, not inputs.
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 1
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 all
1 bin_count
1 cat_unmapped_reads_1
1 cat_unmapped_reads_2
2 centrifuge_alignment
1 ctf_abund
1 index
2 mapping
1 megahit_assembly
1 metbat2_binning
1 unknown_abund
2 unmapped_reads_from_ctf
1 wc_all_reads
1 wc_unmapped_reads
17

[Tue Sep 3 20:11:12 2019]
rule centrifuge_alignment:
input: test/data/2_1.fq, test/data/2_2.fq
output: 1_centrifuge/2_classification, 1_centrifuge/2_report
log: logs/1_centrifuge/2.log
jobid: 4
benchmark: benchmarks/1_centrifuge/2.tsv
wildcards: sample=2

centrifuge -x test/database/abv -q -t -1 test/data/2_1.fq -2 test/data/2_2.fq -S 1_centrifuge/2_classification --report-file 1_centrifuge/2_report &> logs/1_centrifuge/2.log
[Tue Sep 3 20:11:12 2019]
Error in rule centrifuge_alignment:
jobid: 4
output: 1_centrifuge/2_classification, 1_centrifuge/2_report
log: logs/1_centrifuge/2.log (check log file(s) for error message)
shell:
centrifuge -x test/database/abv -q -t -1 test/data/2_1.fq -2 test/data/2_2.fq -S 1_centrifuge/2_classification --report-file 1_centrifuge/2_report &> logs/1_centrifuge/2.log
(exited with non-zero exit code)

Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/.snakemake/log/2019-09-03T201111.945521.snakemake.log

@JinqunHuang
Author


I also tried it as follows and got a similar error.

By the way, the abv database files were built with the following commands:
$ cd <PATH_TO_CENTRIFUGE>
$ centrifuge-download -o taxonomy taxonomy
$ centrifuge-download -o library -m -d "archaea,bacteria,viral" refseq > seqid2taxid.map
$ cat library/*/*.fna > input-sequences.fna
$ centrifuge-build -p 4 --conversion-table seqid2taxid.map --taxonomy-tree taxonomy/nodes.dmp --name-table taxonomy/names.dmp input-sequences.fna abv
$ mkdir database
$ mv abv* database
$ rm -r taxonomy library
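As a quick sanity check (an editorial sketch, not part of MicroPro or centrifuge), one can verify that an index prefix actually resolves to its .cf files before pointing the pipeline at it; the paths and the demo directory below are illustrative:

```shell
# Sketch: verify that a Centrifuge index prefix has its .cf files on disk.
# The prefix and paths here are illustrative, not taken from the pipeline.
check_index() {
    prefix="$1"                      # e.g. database/abv
    ls "${prefix}".*.cf >/dev/null 2>&1
}

# demo against a throwaway index directory
mkdir -p /tmp/ctf_demo
touch /tmp/ctf_demo/abv.1.cf /tmp/ctf_demo/abv.2.cf /tmp/ctf_demo/abv.3.cf
check_index /tmp/ctf_demo/abv && echo "index found"
check_index /tmp/ctf_demo/missing || echo "index missing"
```

If the check fails for the prefix passed via `-x`, centrifuge will exit non-zero without producing classifications, which matches the behavior seen here.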

$ ll /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/centrifuge/centrifuge-1.0.3-beta/database/
total 26529332
-rw-r--r-- 1 huangjq bc_mg 19964869069 Sep 3 15:35 abv.1.cf
-rw-r--r-- 1 huangjq bc_mg 7198800996 Sep 3 12:12 abv.2.cf
-rw-r--r-- 1 huangjq bc_mg 2360973 Sep 3 10:58 abv.3.cf

$ R --no-save --file=scripts/parameters.R --args P test/data _1 _2 .fq /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/centrifuge/centrifuge-1.0.3-beta 1000
$ snakemake -p -s Snakefile-P
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-known-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-known-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-cross-assembly-P ignored
The flag 'directory' used in rule index is only valid for outputs, not inputs.
The flag 'directory' used in rule mapping is only valid for outputs, not inputs.
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-known-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-cross-assembly-P ignored
Multiple include of /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/snake-map-to-ctg-P ignored
The flag 'directory' used in rule metbat2_binning is only valid for outputs, not inputs.
The flag 'directory' used in rule bin_count is only valid for outputs, not inputs.
The flag 'directory' used in rule unknown_abund is only valid for outputs, not inputs.
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 1
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 all
1 bin_count
1 cat_unmapped_reads_1
1 cat_unmapped_reads_2
2 centrifuge_alignment
1 ctf_abund
1 index
2 mapping
1 megahit_assembly
1 metbat2_binning
1 unknown_abund
2 unmapped_reads_from_ctf
1 wc_all_reads
1 wc_unmapped_reads
17

[Tue Sep 3 20:38:11 2019]
rule centrifuge_alignment:
input: test/data/2_1.fq, test/data/2_2.fq
output: 1_centrifuge/2_classification, 1_centrifuge/2_report
log: logs/1_centrifuge/2.log
jobid: 10
benchmark: benchmarks/1_centrifuge/2.tsv
wildcards: sample=2

centrifuge -x /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/centrifuge/centrifuge-1.0.3-beta/database/abv -q -t -1 test/data/2_1.fq -2 test/data/2_2.fq -S 1_centrifuge/2_classification --report-file 1_centrifuge/2_report &> logs/1_centrifuge/2.log
[Tue Sep 3 20:38:12 2019]
Error in rule centrifuge_alignment:
jobid: 10
output: 1_centrifuge/2_classification, 1_centrifuge/2_report
log: logs/1_centrifuge/2.log (check log file(s) for error message)
shell:
centrifuge -x /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/centrifuge/centrifuge-1.0.3-beta/database/abv -q -t -1 test/data/2_1.fq -2 test/data/2_2.fq -S 1_centrifuge/2_classification --report-file 1_centrifuge/2_report &> logs/1_centrifuge/2.log
(exited with non-zero exit code)

Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/.snakemake/log/2019-09-03T203811.047772.snakemake.log

@JinqunHuang changed the title from "Error in rule centrifuge_alignment" to "exited with non-zero exit code" Sep 5, 2019
@zifanzhu
Owner

zifanzhu commented Sep 9, 2019

Hi JinqunHuang,

Sorry for the late reply!

It seems that the failed run and the successful run used different centrifuge reference locations: /hwfssz4/BC_COM_FP/BC_CCP/Micro/software/centrifuge/centrifuge-1.0.3-beta/database/abv (the one that failed) and test/database/abv (the one that succeeded). If you move your database files (database/abv*) to the installation directory of centrifuge, it should work.

Also, you can check the log at logs/1_centrifuge/*.log to see whether it reports that the reference (i.e., the abv* files) could not be found.
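That log check can be scripted; a minimal sketch (the directory and file names below are illustrative) that flags empty centrifuge logs, since an empty log usually means the command died before writing anything:

```shell
# Sketch: flag empty logs under a logs/1_centrifuge/ style directory.
# An empty log usually means the command failed before producing output.
scan_logs() {
    for f in "$1"/*.log; do
        [ -e "$f" ] || continue          # no logs at all
        if [ -s "$f" ]; then
            echo "$f: has content"
        else
            echo "$f: EMPTY"
        fi
    done
}

# demo with a throwaway log directory
mkdir -p /tmp/log_demo
: > /tmp/log_demo/2.log                  # empty log
echo "Error: could not open index" > /tmp/log_demo/3.log
scan_logs /tmp/log_demo
```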

Best,
Zifan

@JinqunHuang
Author

JinqunHuang commented Sep 10, 2019 via email

@zifanzhu
Copy link
Owner

Hmm, that's interesting. "logs/1_centrifuge/" being empty makes it hard to debug.

Note one thing about snakemake: every time you start a fresh run, you need to delete the old outputs first. What I'd like you to try is to delete the MicroPro folder, git clone it again, and run only the installation test. If you get errors, please check the "MicroPro/MicrobialPip/logs/" folder to see whether any logs are generated. Each log should sit in its corresponding subfolder, e.g. centrifuge logs in "MicroPro/MicrobialPip/logs/1_centrifuge/".

Let me know if the "MicroPro/MicrobialPip/logs/" folder is still empty.

Zifan

@JinqunHuang
Author

Dear Zifan,

I git cloned MicroPro just now and followed the commands below. I still get the same error (exited with non-zero exit code). There was one log (2.log) in the MicroPro/MicrobialPip/logs/, and it is empty.
The error and the source file are attached.

$ git clone https://github.com/zifanzhu/MicroPro.git
$ cd MicroPro/MicrobialPip
$ chmod 755 utils/*
$ R --no-save --file=scripts/parameters.R --args P test/data _1 _2 .fq test 1000
$ snakemake -p -s Snakefile-P

Source.sh.txt
snakemake.log.txt

@zifanzhu
Owner

It says in the log that the complete log is in:

/hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/.snakemake/log/2019-09-10T105119.879332.snakemake.log

Could you please send that to me as well?

@JinqunHuang
Author

It says in the log that the complete log is in:

/hwfssz4/BC_COM_FP/BC_CCP/Micro/software/MicroPro/MicrobialPip/.snakemake/log/2019-09-10T105119.879332.snakemake.log

Could you please send that to me as well?

The log is as below.

2019-09-10T105119.879332.snakemake.log.txt

@zifanzhu
Owner

Thanks JinQun. I'll test it on my end soon.

Zifan

@zifanzhu
Owner

Hi JinQun,

Are you using the latest snakemake (5.6.0)? I tried it on my end and got the same error as yours. It appears that something in this newest version of snakemake causes the problem. Uninstall it and re-install version 5.3.0. Also make sure to install the python package "psutil". It should work then.

Best,
Zifan

@JinqunHuang
Author

Hi Zifan,
I uninstalled snakemake (5.5.3), re-installed version 5.3.0, and also installed the python package "psutil".
It worked, but I then got an error at step 14 involving 'res/centrifuge_abundance.rds', probable reason 'No such file or directory'.

2019-09-11T104400.906606.snakemake.log.txt
logs.tar.gz

@zifanzhu
Owner

Sorry, there's a bug in the 'snake-binning-P' file. I fixed it just now. Could you please replace the old 'snake-binning-P' file with the new one? In your case, you can keep all the old results and run 'snakemake -p -s Snakefile-P' again. Snakemake will skip what you've already obtained and only run the parts that are missing. Let me know if this solution works.

Best,
Zifan

@JinqunHuang
Author

JinqunHuang commented Sep 12, 2019

Dear zifan,

I followed your advice and the pipeline worked.

Thanks for your help again.

Best,
JinQun Huang

@zifanzhu
Owner

No problem! Thank you as well!

Zifan
