
[BUG] std::thread crash while using dlopen/dlclose with APP_STL := c++_static #936

Closed
gejianqiang opened this issue Mar 20, 2019 · 6 comments


gejianqiang commented Mar 20, 2019

  • NDK Version: all NDKs with APP_STL := c++_static; tested from NDK r13
  • Build system: ndk-build
  • Host OS: Windows 10 64-bit
  • ABI: arm64-v8a
  • NDK API level: APP_PLATFORM := android-24
  • Device API level: not set

Run this demo on an Android device with the current O or P release.
With APP_STL := c++_static, the crash occurs when the outer loop reaches i = 128;
with APP_STL := gnustl_static, there is no crash.
But from NDK r18 onward only c++_static is available, so I have to stay on NDK r17 with gnustl_static.

The flow of the test demo and the .so is:

```cpp
// demo:
int main() {
    for (int i = 0; i < 1000; i++) {
        // dlopen the .so
        for (int j = 0; j < 1; j++) {
            // call process_func() in the .so
        }
        // dlclose the .so
    }
}

// so:
void process_func() {
    // create a std::thread and join it
}
```


enh commented Mar 20, 2019

We'll be able to look at this much faster if you (a) provide an actual reproducible test case and (b) include the specific crash from logcat.


DanAlbert commented Mar 20, 2019

FWIW you could almost certainly solve your problem by not calling dlclose. dlclose is almost never a good idea and causes lots of problems.

@DanAlbert (Member)

You might also be interested in reading through #360, but since you say your test fails on P I'm guessing that isn't the issue.

@gejianqiang (Author)

> FWIW you could almost certainly solve your problem by not calling dlclose. dlclose is almost never a good idea and causes lots of problems.

It is not possible to avoid calling dlclose or to use RTLD_NODELETE: the daemon runs as a server in the Android HAL and dlopens libraries on demand, and there are hundreds of .so files.

Please raise the priority. Thanks.

@gejianqiang (Author)

> We'll be able to look at this much faster if you (a) provide an actual reproducible test case and (b) include the specific crash from logcat.

(a) Sorry, I can't share the code because of some restrictions; see the demo flow above. The dlopen call is:
dlopen("xxx.so", RTLD_NOW | RTLD_LOCAL); // with RTLD_GLOBAL or RTLD_NODELETE there is no crash

(b) In the demo's console, the message is:
"cannot create thread specific key for __cxa_get_globals()
Aborted"
The logcat output is:
```
03-12 04:15:36.320 20316 20836 E libc++abi: cannot create thread specific key for __cxa_get_globals()
03-12 04:15:36.321 20316 20836 F libc : Fatal signal 6 (SIGABRT), code -6 (SI_TKILL) in tid 20836 (gjqtest_demo), pid 20316 (gjqtest_demo)
03-12 04:15:36.335 20839 20839 I crash_dump64: obtaining output fd from tombstoned, type: kDebuggerdTombstone
03-12 04:15:36.336 785 785 I /system/bin/tombstoned: received crash request for pid 20836
03-12 04:15:36.337 20839 20839 I crash_dump64: performing dump of process 20316 (target tid = 20836)
03-12 04:15:36.338 20839 20839 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
03-12 04:15:36.338 20839 20839 F DEBUG : Build fingerprint: 'kirin/kirin/kirin:9/PPR1.180610.011/24114:eng/test-keys'
03-12 04:15:36.338 20839 20839 F DEBUG : Revision: '0'
03-12 04:15:36.338 20839 20839 F DEBUG : ABI: 'arm64'
03-12 04:15:36.338 20839 20839 F DEBUG : pid: 20316, tid: 20836, name: gjqtest_demo >>> ./gjqtest_demo <<<
03-12 04:15:36.338 20839 20839 F DEBUG : signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr --------
03-12 04:15:36.338 20839 20839 F DEBUG : Abort message: 'cannot create thread specific key for __cxa_get_globals()'
03-12 04:15:36.338 20839 20839 F DEBUG : x0 0000000000000000 x1 0000000000005164 x2 0000000000000006 x3 0000000000000008
03-12 04:15:36.338 20839 20839 F DEBUG : x4 fefefefefefefeff x5 fefefefefefefeff x6 fefefefefefefeff x7 7f7f7f7f7f7f7f7f
03-12 04:15:36.338 20839 20839 F DEBUG : x8 0000000000000083 x9 5b58cc1e681d328c x10 0000000000000000 x11 fffffffc7fffffdf
03-12 04:15:36.338 20839 20839 F DEBUG : x12 0000000000000001 x13 000000005c873268 x14 0004b7e2abae49de x15 0000103f098aa005
03-12 04:15:36.338 20839 20839 F DEBUG : x16 00000079516322e0 x17 0000007951570cf8 x18 0000000000000001 x19 0000000000004f5c
03-12 04:15:36.338 20839 20839 F DEBUG : x20 0000000000005164 x21 000000795068d2d8 x22 00000000ffffff80 x23 00000000ffffffc8
03-12 04:15:36.338 20839 20839 F DEBUG : x24 000000795068d3a0 x25 000000795068d270 x26 000000795068d2b0 x27 0000000000000016
03-12 04:15:36.338 20839 20839 F DEBUG : x28 0000007fd9c7b5a4 x29 000000795068d1e0
03-12 04:15:36.338 20839 20839 F DEBUG : sp 000000795068d1a0 lr 0000007951565824 pc 000000795156584c
03-12 04:15:36.339 20839 20839 I unwind : Malformed section header found, ignoring...
03-12 04:15:36.340 20839 20839 F DEBUG :
03-12 04:15:36.340 20839 20839 F DEBUG : backtrace:
03-12 04:15:36.340 20839 20839 F DEBUG : #00 pc 000000000002284c /system/lib64/libc.so (abort+116)
03-12 04:15:36.340 20839 20839 F DEBUG : #1 pc 00000000000162b8 /data/gjqtest/libgjqtest.so
03-12 04:15:36.340 20839 20839 F DEBUG : #2 pc 0000000000012fd4 /data/gjqtest/libgjqtest.so
03-12 04:15:36.340 20839 20839 F DEBUG : #3 pc 0000000000083e54 /system/lib64/libc.so (pthread_once+148)
03-12 04:15:36.340 20839 20839 F DEBUG : #4 pc 0000000000012ef4 /data/gjqtest/libgjqtest.so
03-12 04:15:36.340 20839 20839 F DEBUG : #5 pc 000000000001287c /data/gjqtest/libgjqtest.so
03-12 04:15:36.340 20839 20839 F DEBUG : #6 pc 000000000000f664 /data/gjqtest/libgjqtest.so
03-12 04:15:36.340 20839 20839 F DEBUG : #7 pc 000000000000fa5c /data/gjqtest/libgjqtest.so
03-12 04:15:36.340 20839 20839 F DEBUG : #8 pc 000000000000e678 /data/gjqtest/libgjqtest.so
03-12 04:15:36.340 20839 20839 F DEBUG : #9 pc 00000000000827f0 /system/lib64/libc.so (__pthread_start(void*)+36)
03-12 04:15:36.340 20839 20839 F DEBUG : #10 pc 00000000000240a0 /system/lib64/libc.so (__start_thread+68)
```

@DanAlbert (Member)

#789
