
Catch IOException when Apalache server binds in-use address #2676

Closed
thpani opened this issue Jul 28, 2023 · 4 comments · Fixed by #2811
Labels
usability UX improvements

Comments


thpani commented Jul 28, 2023

When the Shai / Apalache server tries to bind an address (specifically, a port) that is already in use, it fails with a lengthy error trace.

We should catch and handle the exception gracefully.

Example stacktrace
~/s/a/unstable ❯❯❯ apalache-mc server --write-intermediate=true
10:38:56.015 [main] INFO at.forsyte.apalache.tla.tooling.opt.ServerCmd -- Loading configuration
# Usage statistics is OFF. We care about your privacy.
# If you want to help our project, consider enabling statistics with config --enable-stats=true.

Output directory: /Users/thomas/src/apalache/unstable/_apalache-out/server/2023-07-28T10-38-56_9205152170509149665
# APALACHE version: v0.41.2-4-g561d08f38 | build: v0.41.2-4-g561d08f38 I@10:38:56.201
Starting server on port 8822...                                   I@10:38:56.207
The Apalache server is running on port 8822. Press Ctrl-C to stop.
Fiber failed.
A checked error was not handled.
java.io.IOException: Failed to bind to address 0.0.0.0/0.0.0.0:8822
	at io.grpc.netty.NettyServer.start(NettyServer.java:328)
	at io.grpc.internal.ServerImpl.start(ServerImpl.java:185)
	at io.grpc.internal.ServerImpl.start(ServerImpl.java:94)
	at scalapb.zio_grpc.Server$ServiceImpl.$anonfun$start$1(Server.scala:44)
	at zio.internal.FiberContext.evaluateNow(FiberContext.scala:365)
	at zio.internal.FiberContext.$anonfun$fork$15(FiberContext.scala:779)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.net.BindException: Address already in use
	at java.base/sun.nio.ch.Net.bind0(Native Method)
	at java.base/sun.nio.ch.Net.bind(Net.java:555)
	at java.base/sun.nio.ch.ServerSocketChannelImpl.netBind(ServerSocketChannelImpl.java:337)
	at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:294)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:141)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:562)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:600)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:579)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
	at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	... 1 more

Fiber:Id(1690533536298,1) was supposed to continue to:
  a future continuation at zio.ZIO.unit(ZIO.scala:2151)
  a future continuation at <unknown>.<unknown>(Server.scala:0)
  a future continuation at zio.ZIO.run(ZIO.scala:1764)
  a future continuation at zio.ZIO.bracket_(ZIO.scala:291)
  a future continuation at zio.ZManaged$.makeExit(ZManaged.scala:2083)
  a future continuation at zio.ZManaged.map(ZManaged.scala:484)
  a future continuation at zio.ZManaged.flatMap(ZManaged.scala:337)
  a future continuation at zio.ZManaged.map(ZManaged.scala:484)
  a future continuation at zio.ZManaged.map(ZManaged.scala:484)
  a future continuation at zio.ZManaged.flatMap(ZManaged.scala:337)
  a future continuation at zio.ZManaged.flatMap(ZManaged.scala:337)
  a future continuation at zio.ZIO.run(ZIO.scala:1764)
  a future continuation at zio.ZIO.bracket_(ZIO.scala:291)
  a future continuation at zio.ZManaged.use(ZManaged.scala:1089)
  a future continuation at zio.ZIO.run(ZIO.scala:1764)
  a future continuation at zio.ZManaged.use(ZManaged.scala:1088)
  a future continuation at zio.ZIO.exitCode(ZIO.scala:626)

Fiber:Id(1690533536298,1) execution trace:
  at scalapb.zio_grpc.Server$ServiceImpl.start(Server.scala:44)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:559)
  at zio.ZManaged$.makeExit(ZManaged.scala:2082)
  at zio.ZIO$AccessPartiallyApplied$.apply(ZIO.scala:4470)
  at zio.ZManaged.flatMap(ZManaged.scala:336)
  at zio.ZManaged.map(ZManaged.scala:484)
  at zio.ZManaged.flatMap(ZManaged.scala:337)
  at zio.ZManaged.map(ZManaged.scala:484)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:160)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO$.bracketExit(ZIO.scala:2437)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:563)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.ZIO.run(ZIO.scala:1764)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at zio.ZManaged.mapM(ZManaged.scala:517)
  at at.forsyte.apalache.shai.v1.cmdExecutor.ZioCmdExecutor$ZCmdExecutor$$anon$3.bind(ZioCmdExecutor.scala:27)
  at zio.ZIO$.runtime(ZIO.scala:4000)
  at zio.ZIO$.executor(ZIO.scala:2879)
  at zio.ZIO$.runtime(ZIO.scala:3999)
  at zio.ZIO$.runtime(ZIO.scala:3998)
  at zio.ZIO$.runtime(ZIO.scala:3998)
  at zio.ZIO$AccessPartiallyApplied$.apply(ZIO.scala:4470)
  at scalapb.zio_grpc.ZBindableService$$anon$1.bindService(ZBindableService.scala:31)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:559)
  at zio.ZIO.provideSome(ZIO.scala:1266)
  at zio.ZManaged.mapM(ZManaged.scala:517)
  at zio.ZManaged$.fromEffect(ZManaged.scala:1883)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:160)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO$.bracketExit(ZIO.scala:2437)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:563)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.ZIO.run(ZIO.scala:1764)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at at.forsyte.apalache.shai.v1.RpcServer.createCmdExecutorService(RpcServer.scala:28)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:559)
  at zio.ZIO.provideSome(ZIO.scala:1266)
  at zio.ZManaged.flatMap(ZManaged.scala:336)
  at zio.ZManaged.flatMap(ZManaged.scala:337)
  at zio.ZManaged.map(ZManaged.scala:484)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:160)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO$.bracketExit(ZIO.scala:2437)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:563)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.ZIO.run(ZIO.scala:1764)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at zio.ZManaged.mapM(ZManaged.scala:517)
  at at.forsyte.apalache.shai.v1.transExplorer.ZioTransExplorer$ZTransExplorer$$anon$3.bind(ZioTransExplorer.scala:29)
  at zio.ZIO$.runtime(ZIO.scala:4000)
  at zio.ZIO$.executor(ZIO.scala:2879)
  at zio.ZIO$.runtime(ZIO.scala:3999)
  at zio.ZIO$.runtime(ZIO.scala:3998)
  at zio.ZIO$.runtime(ZIO.scala:3998)
  at zio.ZIO$AccessPartiallyApplied$.apply(ZIO.scala:4470)
  at scalapb.zio_grpc.ZBindableService$$anon$1.bindService(ZBindableService.scala:31)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:559)
  at zio.ZIO.provideSome(ZIO.scala:1266)
  at zio.ZManaged.mapM(ZManaged.scala:517)
  at zio.ZManaged$.fromEffect(ZManaged.scala:1883)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:160)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO$.bracketExit(ZIO.scala:2437)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:563)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.ZIO.run(ZIO.scala:1764)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at at.forsyte.apalache.shai.v1.RpcServer.createTransExplorerService(RpcServer.scala:26)
  at zio.ZRef$.make(ZRef.scala:756)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO.bracket_(ZIO.scala:291)
  at zio.internal.FiberContext.evaluateNow(FiberContext.scala:559)
  at zio.ZIO.provideSome(ZIO.scala:1266)
  at zio.ZManaged.flatMap(ZManaged.scala:336)
  at zio.ZManaged$.succeed(ZManaged.scala:2497)
  at zio.ZManaged.flatMap(ZManaged.scala:336)
  at zio.ZManaged$.succeed(ZManaged.scala:2497)
  at zio.ZManaged.flatMap(ZManaged.scala:336)
  at zio.ZManaged$.fromEffect(ZManaged.scala:1883)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:160)
  at zio.internal.FiberContext$InterruptExit$.apply(FiberContext.scala:153)
  at zio.ZIO$.effectSuspendTotal(ZIO.scala:2837)
  at zio.ZIO$.bracketExit(ZIO.scala:2437)

Fiber:Id(1690533536298,1) was spawned by:

Fiber:Id(1690533536237,0) was supposed to continue to:
  a future continuation at zio.App.main(App.scala:60)
  a future continuation at zio.App.main(App.scala:59)

Fiber:Id(1690533536237,0) ZIO Execution trace: <empty trace>

Fiber:Id(1690533536237,0) was spawned by: <empty trace>
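For illustration, here is a minimal, self-contained sketch (plain Java, not the actual Apalache/zio-grpc code; the class and helper names are made up) of catching the bind failure and turning it into a one-line message instead of a full trace:

```java
import java.io.IOException;
import java.net.BindException;
import java.net.InetSocketAddress;
import java.nio.channels.ServerSocketChannel;

public class BindErrorDemo {
    /** Walk the cause chain looking for a BindException. */
    static boolean isAddressInUse(Throwable t) {
        for (Throwable c = t; c != null; c = c.getCause()) {
            if (c instanceof BindException) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        try (ServerSocketChannel first = ServerSocketChannel.open()) {
            // Bind to an OS-assigned free port, then try to bind it again.
            first.bind(new InetSocketAddress("127.0.0.1", 0));
            int port = ((InetSocketAddress) first.getLocalAddress()).getPort();
            try (ServerSocketChannel second = ServerSocketChannel.open()) {
                second.bind(new InetSocketAddress("127.0.0.1", port));
            } catch (IOException e) {
                if (isAddressInUse(e)) {
                    // Friendly one-liner instead of the fiber dump above.
                    System.out.println("Error: port " + port + " is already in use.");
                } else {
                    throw e;
                }
            }
        }
    }
}
```

In the ZIO-based server the equivalent would be to recover from the effect's `IOException` error channel rather than to use `try`/`catch`, but the cause-chain check is the same.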
@thpani thpani added usability UX improvements effort-easy Can be completed within about 1 day labels Jul 28, 2023
@thpani thpani self-assigned this Jul 28, 2023
@thpani thpani removed their assignment Aug 8, 2023

thpani commented Aug 8, 2023

I spent a bit of time on this, but it seems to fail somewhere inside ZIO.

Giving up for now, because I've hit my self-imposed timebox on this.


konnov commented Aug 24, 2023

I just ran into the same issue. Even an experienced user would find it hard to understand what's going on here. Could we at least get rid of the ZIO stack trace? @shonfeder wdyt?

@thpani thpani removed the effort-easy Can be completed within about 1 day label Oct 17, 2023

fan-tom commented Jan 21, 2024

I managed to strip these long stack traces, preserving only the root-cause stack trace:

19:18:18.959 [main] INFO at.forsyte.apalache.tla.tooling.opt.ServerCmd -- Loading configuration
# Usage statistics is OFF. We care about your privacy.
# If you want to help our project, consider enabling statistics with config --enable-stats=true.

Output directory: /Users/alexey/dev/private/tlaplus/apalache/_apalache-out/server/2024-01-21T19-18-19_8975742003987029663
# APALACHE version: v0.44.2-39-g0e51e022b | build: v0.44.2-39-g0e51e022b I@19:18:19.399
Starting server on port 8822...                                   I@19:18:19.419
The Apalache server is running on port 8822. Press Ctrl-C to stop.
Error while starting Apalache server                              E@19:18:20.540
java.io.IOException: Failed to bind to address 0.0.0.0/0.0.0.0:8822
	at io.grpc.netty.NettyServer.start(NettyServer.java:333)
	at io.grpc.internal.ServerImpl.start(ServerImpl.java:185)
	at io.grpc.internal.ServerImpl.start(ServerImpl.java:94)
	at scalapb.zio_grpc.Server$ServiceImpl.$anonfun$start$1(Server.scala:44)
	at zio.internal.FiberContext.evaluateNow(FiberContext.scala:365)
	at zio.internal.FiberContext.$anonfun$fork$15(FiberContext.scala:779)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1623)
Caused by: java.net.BindException: Address already in use
	at java.base/sun.nio.ch.Net.bind0(Native Method)
	at java.base/sun.nio.ch.Net.bind(Net.java:556)
	at java.base/sun.nio.ch.ServerSocketChannelImpl.netBind(ServerSocketChannelImpl.java:344)
	at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:301)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:141)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:562)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:600)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:579)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
	at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	... 1 common frames omitted

Process finished with exit code 1

WDYT, is it enough?
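The idea above — drop the ZIO fiber dump and report only the root cause — can be sketched independently of the actual logging setup (plain Java; the class and method names here are illustrative, not Apalache's):

```java
import java.io.IOException;
import java.net.BindException;

public class RootCauseDemo {
    /** Follow the cause chain to the deepest throwable. */
    static Throwable rootCause(Throwable t) {
        Throwable cur = t;
        while (cur.getCause() != null && cur.getCause() != cur) {
            cur = cur.getCause();
        }
        return cur;
    }

    public static void main(String[] args) {
        // Simulate the wrapped failure from the issue.
        Throwable root = new BindException("Address already in use");
        Throwable wrapped =
            new IOException("Failed to bind to address 0.0.0.0/0.0.0.0:8822", root);

        // Log one concise line derived from the root cause,
        // instead of the full fiber execution trace.
        Throwable cause = rootCause(wrapped);
        System.out.println("Error while starting Apalache server: " + cause.getMessage());
    }
}
```

Running this prints a single `Error while starting Apalache server: Address already in use` line rather than the nested traces.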

@shonfeder

Awesome!

Ideally, if we know the failure reason we can avoid any stack trace at all.
