
terminal proxy #14

Open
crgimenes opened this issue Dec 2, 2023 · 5 comments
@crgimenes
Owner

The idea is to have another websocket that receives a stream from a terminal and simply relays it to everyone connected.

Consider separating the application into two parts: one is just the relay, and the other is the transmitter. The transmitter captures data from the local terminal and sends it to the relay, which forwards it to all connected clients.

This way, anyone with the transmitter and the API key can stream their terminal to the relay.

Requirements:

  • The transmission is protected by an API key.
  • WebSocket is used, rather than another communication layer, so that the stream can easily pass through firewalls.
@crgimenes crgimenes self-assigned this Dec 2, 2023
@crgimenes crgimenes added the enhancement New feature or request label Dec 2, 2023
@CarlosLint

I'm a bit puzzled here. Say we have 3 systems on different IP addresses: server.compterm, proxy.compterm, and client.someISPfaraway.

Server.compterm spawns /bin/bash shared over compterm.
Proxy.compterm runs a compterm server that spawns a CLI client which is shared over its own compterm server process.
Client runs compterm-client (over web or CLI, however the end user feels like).

Supposing that we have a fully working compterm TUI/CLI client, is there anything that would stop the show?

@crgimenes
Owner Author

I did not understand your question.
The current version already works; this is just a new feature.

I want to start an instance of compterm on a VPS; I will configure this instance not to launch a shell. Instead, it will receive the shell's output through a port and relay it to the connected clients.

Another instance of compterm, running on a local machine, will load the shell and send its output to the instance running on the VPS, instead of itself accepting connections from clients and relaying the shell output.

This feature does not replace what we already have in the current version. It's just a better option for those who don't have a home server, whose ISP doesn't allow incoming connections, who are behind a firewall, or who don't want to expose their internet connection directly.

This new feature also allows you to use the client you prefer. You can use the web client, the CLI, or any other. But instead of connecting directly to the machine running the shell, you will connect to the instance acting as the relay.

@CarlosLint

It goes like this: imagine we have a program which is a compterm client using a console/text/CLI interface. (We do have such a client, don't we? I'm not sure whether it's working or not.)

Let's say such a program is "client.go", resides at the top of compterm's directory, and takes an IP/hostname as its single required argument, ok?

Couldn't we just run "go run main.go" and then "go run client.go server.compterm.org" and get a working compterm proxy?

@crgimenes
Owner Author

At the moment you can't.
compterm does not have an input port; it only relays the stdin/stdout of the executable that it loaded itself.

@CarlosLint

Tbh, most people won't use duplex communication from what I can tell, and as long as all clients connected to the proxy are read-only, why would the proxy need to send its input?

Either way, imho it would be best to implement authentication (even better if we could pass it via command-line arguments in the client, to help it act as a proxy) plus duplex communication, and the relay stuff will most likely solve itself.

That's just my 2 cents; if anyone thinks it's still a good idea to chase, I'm leaving the space here open for further commenting. Ciao.
