While developing the Soundness implementation of Easy Racer, I observed behavior that seems to suggest the loser task(s) aren't being cancelled (or that cancellation isn't being propagated properly).
HTTP connections not closing for scenarios 1, 6, and 7 loser tasks
When running all scenarios (omitting scenarios 3 and 10 to reduce log verbosity), the following is observed on the scenario server (pay particular attention to the last three lines):
Started server on port: 8080 (debug=true)
scenario=1 Num concurrent connections = 1
scenario=1 Num concurrent connections = 2
scenario=1 Num concurrent connections = 1 <<< should be 0
scenario=2 Num concurrent connections = 1
scenario=2 Num concurrent connections = 2
scenario=2 Num concurrent connections = 1
scenario=2 Num concurrent connections = 2
scenario=2 Num concurrent connections = 1
scenario=2 Num concurrent connections = 0
scenario=4 Num concurrent connections = 1
scenario=4 Num concurrent connections = 2
scenario=4 Num concurrent connections = 1
scenario=4 Num concurrent connections = 0
scenario=5 Num concurrent connections = 1
scenario=5 Num concurrent connections = 2
scenario=5 Num concurrent connections = 1
scenario=5 Num concurrent connections = 0
scenario=6 Num concurrent connections = 1
scenario=6 Num concurrent connections = 2
scenario=6 Num concurrent connections = 3
scenario=6 Num concurrent connections = 2
scenario=6 Num concurrent connections = 1 <<< should be 0
scenario=7 Num concurrent connections = 1
scenario=7 Num concurrent connections = 2
scenario=7 Num concurrent connections = 1 <<< should be 0
scenario=8 Num concurrent connections = 1
scenario=8 Num concurrent connections = 2
scenario=8 Num concurrent connections = 1
scenario=8 Num concurrent connections = 0
scenario=9 Num concurrent connections = 1
scenario=9 Num concurrent connections = 2
scenario=9 Num concurrent connections = 3
scenario=9 Num concurrent connections = 4
scenario=9 Num concurrent connections = 5
scenario=9 Num concurrent connections = 6
scenario=9 Num concurrent connections = 7
scenario=9 Num concurrent connections = 8
scenario=9 Num concurrent connections = 9
scenario=9 Num concurrent connections = 10
scenario=9 Num concurrent connections = 9
scenario=9 Num concurrent connections = 8
scenario=9 Num concurrent connections = 7
scenario=9 Num concurrent connections = 6
scenario=9 Num concurrent connections = 5
scenario=9 Num concurrent connections = 4
scenario=9 Num concurrent connections = 3
scenario=9 Num concurrent connections = 2
scenario=9 Num concurrent connections = 1
scenario=9 Num concurrent connections = 0
scenario=11 Num concurrent connections = 1
scenario=11 Num concurrent connections = 2
scenario=11 Num concurrent connections = 3
scenario=11 Num concurrent connections = 2
scenario=11 Num concurrent connections = 1
scenario=11 Num concurrent connections = 0
scenario=11 Num concurrent connections = 1
scenario=11 Num concurrent connections = 0
scenario=6 Num concurrent connections = 0
scenario=1 Num concurrent connections = 0
scenario=7 Num concurrent connections = 0
Observe that for scenarios 1, 6, and 7, one connection remains open until the very end (when the program terminates). Contrast this with one of the other implementations, whose server logs look like this:
Started server on port: 8080 (debug=true)
scenario=1 Num concurrent connections = 1
scenario=1 Num concurrent connections = 2
scenario=1 Num concurrent connections = 1
scenario=1 Num concurrent connections = 0
scenario=2 Num concurrent connections = 1
scenario=2 Num concurrent connections = 2
scenario=2 Num concurrent connections = 1
scenario=2 Num concurrent connections = 2
scenario=2 Num concurrent connections = 1
scenario=2 Num concurrent connections = 2
scenario=2 Num concurrent connections = 1
scenario=2 Num concurrent connections = 0
scenario=4 Num concurrent connections = 1
scenario=4 Num concurrent connections = 2
scenario=4 Num concurrent connections = 1
scenario=4 Num concurrent connections = 0
scenario=5 Num concurrent connections = 1
scenario=5 Num concurrent connections = 2
scenario=5 Num concurrent connections = 1
scenario=5 Num concurrent connections = 0
scenario=6 Num concurrent connections = 1
scenario=6 Num concurrent connections = 2
scenario=6 Num concurrent connections = 3
scenario=6 Num concurrent connections = 2
scenario=6 Num concurrent connections = 1
scenario=6 Num concurrent connections = 0
scenario=7 Num concurrent connections = 1
scenario=7 Num concurrent connections = 2
scenario=7 Num concurrent connections = 1
scenario=7 Num concurrent connections = 0
scenario=8 Num concurrent connections = 1
scenario=8 Num concurrent connections = 2
scenario=8 Num concurrent connections = 1
scenario=8 Num concurrent connections = 0
scenario=9 Num concurrent connections = 1
scenario=9 Num concurrent connections = 2
scenario=9 Num concurrent connections = 3
scenario=9 Num concurrent connections = 4
scenario=9 Num concurrent connections = 5
scenario=9 Num concurrent connections = 6
scenario=9 Num concurrent connections = 7
scenario=9 Num concurrent connections = 8
scenario=9 Num concurrent connections = 9
scenario=9 Num concurrent connections = 10
scenario=9 Num concurrent connections = 9
scenario=9 Num concurrent connections = 8
scenario=9 Num concurrent connections = 7
scenario=9 Num concurrent connections = 6
scenario=9 Num concurrent connections = 5
scenario=9 Num concurrent connections = 4
scenario=9 Num concurrent connections = 3
scenario=9 Num concurrent connections = 2
scenario=9 Num concurrent connections = 1
scenario=9 Num concurrent connections = 0
scenario=11 Num concurrent connections = 1
scenario=11 Num concurrent connections = 2
scenario=11 Num concurrent connections = 3
scenario=11 Num concurrent connections = 2
scenario=11 Num concurrent connections = 1
scenario=11 Num concurrent connections = 0
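For reference, what I expect in scenarios 1, 6, and 7 is that as soon as the winning task completes, the losing task is cancelled and its connection is released, so the server's connection count drops back to 0 immediately. Below is a minimal sketch of those semantics. It does not use Soundness; it uses cats-effect's IO.race, with a Resource standing in for the HTTP connection, purely to make the expected behaviour concrete (the names and latencies are made up for illustration):

```scala
import cats.effect.{IO, IOApp, Resource}
import scala.concurrent.duration._

object Scenario1Sketch extends IOApp.Simple:

  // Simulated HTTP connection: acquiring it "opens" the connection,
  // and the finalizer "closes" it.
  def connection(name: String): Resource[IO, Unit] =
    Resource.make(IO.println(s"$name: connection opened"))(_ =>
      IO.println(s"$name: connection closed"))

  // One "request" that holds its connection open for the given duration.
  def request(name: String, latency: FiniteDuration): IO[String] =
    connection(name).use(_ => IO.sleep(latency).as(name))

  def run: IO[Unit] =
    // IO.race cancels the losing fiber as soon as the winner completes,
    // which runs the loser's finalizer and prints "slow: connection closed"
    // right away -- the behaviour the scenario server's connection count
    // should reflect.
    IO.race(request("fast", 1.second), request("slow", 1.minute))
      .flatMap(winner => IO.println(s"winner: ${winner.merge}"))
```

With this sketch, "slow: connection closed" prints as soon as the fast request wins. The equivalent behaviour in the Soundness implementation would show the connection count for scenarios 1, 6, and 7 returning to 0 right after each scenario finishes, rather than only at program termination.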
Thank you for the clear report! I'm not sure why that is, but I'll investigate. (I added a couple of annotations to the original report to make the issue clearer.)