Error when running scripts/pipeline/interactive.py #145

Closed
ericchansen opened this issue May 26, 2018 · 8 comments

@ericchansen

Anyone seen this and can point me in the right direction?

(drqa) eric@WIN-DJKY13BTNQ:~/repos/DrQA$ python scripts/pipeline/interactive.py
05/26/2018 12:45:57 PM: [ Running on CPU only. ]
05/26/2018 12:45:57 PM: [ Initializing pipeline... ]
05/26/2018 12:45:57 PM: [ Initializing document ranker... ]
05/26/2018 12:45:57 PM: [ Loading /home/eric/repos/DrQA/data/wikipedia/docs-tfidf-ngram=2-hash=16777216-tokenizer=simple.npz ]
05/26/2018 12:48:46 PM: [ Initializing document reader... ]
05/26/2018 12:48:46 PM: [ Loading model /home/eric/repos/DrQA/data/reader/multitask.mdl ]
05/26/2018 12:48:55 PM: [ Initializing tokenizers and document retrievers... ]
Traceback (most recent call last):
  File "scripts/pipeline/interactive.py", line 70, in <module>
    tokenizer=args.tokenizer
  File "/home/eric/repos/DrQA/drqa/pipeline/drqa.py", line 140, in __init__
    initargs=(tok_class, tok_opts, db_class, db_opts, fixed_candidates)
  File "/home/eric/anaconda3/envs/drqa/lib/python3.6/multiprocessing/context.py", line 119, in Pool
    context=self.get_context())
  File "/home/eric/anaconda3/envs/drqa/lib/python3.6/multiprocessing/pool.py", line 174, in __init__
    self._repopulate_pool()
  File "/home/eric/anaconda3/envs/drqa/lib/python3.6/multiprocessing/pool.py", line 239, in _repopulate_pool
    w.start()
  File "/home/eric/anaconda3/envs/drqa/lib/python3.6/multiprocessing/process.py", line 105, in start
    self._popen = self._Popen(self)
  File "/home/eric/anaconda3/envs/drqa/lib/python3.6/multiprocessing/context.py", line 277, in _Popen
    return Popen(process_obj)
  File "/home/eric/anaconda3/envs/drqa/lib/python3.6/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/home/eric/anaconda3/envs/drqa/lib/python3.6/multiprocessing/popen_fork.py", line 66, in _launch
    self.pid = os.fork()
OSError: [Errno 22] Invalid argument
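
For reference, the failing call boils down to building a multiprocessing.Pool with an initializer. The sketch below mimics that pattern with placeholder initargs (not DrQA's actual tokenizer/db setup); if it also dies in os.fork(), the problem is presumably my environment rather than DrQA:

# Minimal stand-in for the Pool construction in drqa/pipeline/drqa.py.
# The initializer and its args here are placeholders, not DrQA's real ones.
import multiprocessing

def init(opts):
    # stand-in for DrQA's per-worker setup (tokenizer, doc db, candidates)
    pass

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=4, initializer=init, initargs=({},))
    print(pool.map(abs, [-1, -2, -3]))
    pool.close()
    pool.join()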
@ajfisch
Contributor

ajfisch commented May 27, 2018

You’re not running Windows, right?

@ericchansen
Author

Ubuntu subsystem in Windows 10.

Windows 10 Version 1709 (OS Build 16299.431)
Ubuntu xenial 16.04.3 LTS

$ python -c "import os; print(os.fork())"
127
0
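
So a bare fork works. One thing I haven't tried yet is a pool built with the 'spawn' start method instead of the default fork, to see whether the failure is specific to os.fork() here (just a sketch; whether DrQA itself would run under spawn is a separate question):

# Untested idea: does a spawn-based pool work where the fork-based one fails?
import multiprocessing

def square(x):
    return x * x

if __name__ == '__main__':
    ctx = multiprocessing.get_context('spawn')
    with ctx.Pool(processes=4) as pool:
        print(pool.map(square, range(8)))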

@ajfisch
Contributor

ajfisch commented May 28, 2018

I'm not sure. A guess might be that it's a disguised memory error. Does the standard tokenizer run as expected with multiple workers (i.e. running preprocess or something)?
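
If it is memory, a rough probe (not something from DrQA; psutil isn't a dependency, and the 2 GB buffer below is an arbitrary stand-in for the ranker matrix) would be to check the headroom and then fork while holding a large heap:

# Rough probe for the memory theory: report available memory, then fork once
# while holding a sizable buffer. The 2 GB figure is arbitrary.
import os
import psutil

print('available before allocation: %.1f GB'
      % (psutil.virtual_memory().available / 1e9))
buf = bytearray(2 * 1024 ** 3)  # placeholder for the large tf-idf matrix
print('available after allocation:  %.1f GB'
      % (psutil.virtual_memory().available / 1e9))

pid = os.fork()
if pid == 0:
    os._exit(0)  # child exits immediately
os.waitpid(pid, 0)
print('fork with a large heap succeeded')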

@ericchansen
Author

ericchansen commented May 29, 2018

scripts/reader/preprocess.py, scripts/reader/interactive.py, and scripts/retriever/interactive.py all seem to be working as intended.

I reinstalled Java (sudo apt-get install default-jre) and ran conda update --all. Following an unlikely train of thought, I also ran conda install psutil. None of this seems to have mattered.

@ericchansen
Author

Created an Azure data science VM and walked through the README.md examples. Everything works in this environment.

Maybe we can chalk this up to a problem with the Ubuntu subsystem in Windows? I'm not sure how to follow up on this issue.

@ajfisch
Contributor

ajfisch commented May 30, 2018

Yeah, I'm sorry I'm not much help! It seems to be some subtle compatibility issue; out of my zone though. Using the VM is probably best.

@ajfisch
Contributor

ajfisch commented Jun 6, 2018

Closing. If you find out more, please do update the issue.

ajfisch closed this as completed Jun 6, 2018
@idrissbachali

I've got the same issue. I'm wondering if I could use a Google Colab GPU instead; is that possible?
