FATAL ERROR: HandleScope::HandleScope Entering the V8 API without proper locking in place #334

Closed
ashishchandr70 opened this issue Jan 10, 2021 · 8 comments

@ashishchandr70

Hi,

I am using threads.js for a multi-player game. The main thread spawns two separate workers that each execute the same function.

The two worker threads do their work, and I await a websocket message that resolves a promise indicating the processing is complete.

However, when I call Thread.terminate, I immediately get the following error:

FATAL ERROR: HandleScope::HandleScope Entering the V8 API without proper locking in place
 1: 0xa18150 node::Abort() [node]
 2: 0xa1855c node::OnFatalError(char const*, char const*) [node]
 3: 0xb9638a v8::Utils::ReportApiFailure(char const*, char const*) [node]
 4: 0xb9792e v8::HandleScope::HandleScope(v8::Isolate*) [node]
 5: 0xacd89e node::worker::Worker::JoinThread() [node]
 6: 0x9baed0 node::Environment::stop_sub_worker_contexts() [node]
 7: 0x9bafe0 node::Environment::Stop() [node]
 8: 0x9e5e79 node::Stop(node::Environment*) [node]
 9: 0xad2926 node::worker::Worker::StopThread(v8::FunctionCallbackInfo<v8::Value> const&) [node]
10: 0xc02529  [node]
11: 0xc04317 v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) [node]
12: 0x1409e59  [node]
Aborted (core dumped)
npm ERR! Test failed.  See above for more details.
Waiting for the debugger to disconnect...

Here is my code from the master (main thread):

// spawn, Thread and Worker come from "threads"; WSClient, WebSocket, mockurl
// and Loaded are defined elsewhere in the test.
let playerThread1, playerThread2; // declared outside the try so the finally block can terminate them

try {
    let battleMessage = await new Promise(async (resolve, reject) => {
        const ws = new WSClient(WebSocket, `${mockurl.replace('http', 'ws')}/ws`, ['BattleCompleted', 'BattleCreated']);

        ws.on('msg', async (message) => {
            console.log(`WS Message:`, message);
            if (message.subject === 'BattleCompleted') {
                resolve(message);
            }
        });

        playerThread1 = await spawn(new Worker(`../scripts/player-setup`));
        playerThread2 = await spawn(new Worker(`../scripts/player-setup`));

        Thread.events(playerThread1).subscribe(event => {
            console.log(`playerThread1 event: ${JSON.stringify(event, null, 4)}`);
            switch (event.type) {
            case 'message':
                if (event.data.type === 'result' && event.data.complete === true) {
                    console.log(`playerThread1 has completed processing`);
                }
                else if (event.data.type === 'uncaughtError') console.error(`Error from playerThread1: ${JSON.stringify(event.data.error, null, 4)}`);
                break;
            default:
                break;
            }
        });
        Thread.events(playerThread2).subscribe(event => {
            console.log(`playerThread2 event: ${JSON.stringify(event, null, 4)}`);
            switch (event.type) {
            case 'message':
                if (event.data.type === 'result' && event.data.complete === true) {
                    console.log(`playerThread2 has completed processing`);
                }
                else if (event.data.type === 'uncaughtError') console.error(`Error from playerThread2: ${JSON.stringify(event.data.error, null, 4)}`);
                break;
            default:
                break;
            }
        });

        let player1 = await playerThread1.executePvPBattle(Loaded.location, { playerName: 'SuperElf' });
        let player2 = await playerThread2.executePvPBattle(Loaded.location, { playerName: 'Frodo' });
    });

    let winner = JSON.parse(battleMessage.payload);
    console.log(`Winner: ${winner}`);
}
catch (e) {
    console.error(`Error received ${e}`);
}
finally {
    await Thread.terminate(playerThread1);
    await Thread.terminate(playerThread2);
}

Everything works normally and I get the console.log for the winner. However, the moment it executes await Thread.terminate(playerThread1), I get the V8 API error.
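
For context, the worker script (../scripts/player-setup) is not shown above. It exposes executePvPBattle via threads.js, roughly along these lines (a simplified sketch; the actual battle logic and websocket calls are omitted):

// ../scripts/player-setup (simplified sketch of the worker side)
const { expose } = require("threads/worker");

expose({
    async executePvPBattle(location, options) {
        // ... set up the player and play out the battle here ...
        // When this function resolves, threads.js sends a
        // { type: 'result', complete: true } message back to the main
        // thread, which is what the Thread.events subscriptions log.
        return { playerName: options.playerName, complete: true };
    }
});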

@andywer

andywer commented Jan 10, 2021

Hey @ashishchandr70, thanks for reporting!

What version of node.js and threads.js are you running?

andywer added the bug label Jan 10, 2021
@ashishchandr70

Hi @andywer, thanks for getting back on a Sunday!

Node: v12.20.0
Threads: 1.6.3

@cameronbraid

I get the same error

node 14.15.4

FATAL ERROR: HandleScope::HandleScope Entering the V8 API without proper locking in place
 1: 0xa04200 node::Abort() [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
 2: 0x94e4e9 node::FatalError(char const*, char const*) [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
 3: 0xb794da v8::Utils::ReportApiFailure(char const*, char const*) [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
 4: 0xb7ab6c v8::HandleScope::HandleScope(v8::Isolate*) [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
 5: 0x96a439 node::InternalCallbackScope::InternalCallbackScope(node::Environment*, v8::Local<v8::Object>, node::async_context const&, int) [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
 6: 0x96b172 node::InternalMakeCallback(node::Environment*, v8::Local<v8::Object>, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
 7: 0x96b51c node::MakeCallback(v8::Isolate*, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
 8: 0x7f15c6890903  [/media/workspace/drivenow-frontend/node_modules/grpc/src/node/extension_binary/node-v83-linux-x64-glibc/grpc_node.node]
 9: 0x7f15c689b3f3  [/media/workspace/drivenow-frontend/node_modules/grpc/src/node/extension_binary/node-v83-linux-x64-glibc/grpc_node.node]
10: 0x13838b9  [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
11: 0x137c5d8 uv_run [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
12: 0xa44974 node::NodeMainInstance::Run() [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
13: 0x9d1e15 node::Start(int, char**) [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
14: 0x7f15cdb910b3 __libc_start_main [/lib/x86_64-linux-gnu/libc.so.6]
15: 0x9694cc  [/home/cameronbraid/.nvm/versions/node/v14.15.4/bin/node]
Aborted (core dumped)

node 12.18.2

FATAL ERROR: HandleScope::HandleScope Entering the V8 API without proper locking in place
 1: 0xa0bb60 node::Abort() [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
 2: 0xa0bf6c node::OnFatalError(char const*, char const*) [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
 3: 0xb81eca v8::Utils::ReportApiFailure(char const*, char const*) [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
 4: 0xb8346e v8::HandleScope::HandleScope(v8::Isolate*) [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
 5: 0x97a9ee node::InternalCallbackScope::InternalCallbackScope(node::Environment*, v8::Local<v8::Object>, node::async_context const&, int) [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
 6: 0x97c065 node::MakeCallback(v8::Isolate*, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
 7: 0x7fa1b18798a3  [/media/workspace/drivenow-frontend/node_modules/grpc/src/node/extension_binary/node-v72-linux-x64-glibc/grpc_node.node]
 8: 0x7fa1b1884213  [/media/workspace/drivenow-frontend/node_modules/grpc/src/node/extension_binary/node-v72-linux-x64-glibc/grpc_node.node]
 9: 0x133b24c  [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
10: 0x1333dd8 uv_run [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
11: 0xa4ecf5 node::NodeMainInstance::Run() [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
12: 0x9dcc68 node::Start(int, char**) [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
13: 0x7fa1b48180b3 __libc_start_main [/lib/x86_64-linux-gnu/libc.so.6]
14: 0x979215  [/home/cameronbraid/.nvm/versions/node/v12.18.2/bin/node]
Aborted (core dumped)

@andywer

andywer commented Feb 4, 2021

Looks to me as if the native code in your grpc package is not comfortable being run in a worker thread.

Forcing the use of tiny-worker instead of worker threads might solve it, but unfortunately #290 is not merged yet…

@cameronbraid

I tried a proof of concept with worker_threads directly and got the same error, so tiny-worker may be the route to go.
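
For reference, the PoC was just the standard worker_threads spawn/terminate shape, roughly this (a generic sketch, not the exact code; grpc-worker.js stands in for a script that loads grpc):

const { Worker } = require("worker_threads");

// grpc-worker.js is a placeholder for a worker script that loads the
// native grpc addon and does some work before signalling it is done.
const worker = new Worker("./grpc-worker.js");

worker.once("message", async () => {
    // Terminating the worker once it reports it is finished is where the
    // same HandleScope abort shows up.
    await worker.terminate();
});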

Is there any way to force tiny-worker to be used, like somehow making the worker_threads module not resolve?

@andywer

andywer commented Feb 6, 2021

like somehow making the worker_threads module not resolve?

Would be pretty hacky, but making the import from worker_threads throw would probably work if tiny-worker is installed.
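
Something along these lines might do it (untested sketch; force-tiny-worker.js is a made-up name, and it patches Node's internal Module._load, so treat it as a stopgap at best):

// force-tiny-worker.js — preload before threads.js, e.g.
//   node -r ./force-tiny-worker.js app.js
// Makes require("worker_threads") throw so threads.js should fall back
// to tiny-worker, assuming tiny-worker is installed.
const Module = require("module");
const originalLoad = Module._load;

Module._load = function (request, parent, isMain) {
    if (request === "worker_threads") {
        throw new Error("worker_threads disabled to force the tiny-worker fallback");
    }
    return originalLoad.apply(this, arguments);
};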

@cameronbraid

I had a go at aliasing the module with babel, but didn't get it working in the end.

I'll just wait for that PR to land.

Thanks,

Cameron

@ashishchandr70

Closing this as all associated PRs have been addressed.
