Improve performance SSR rendering #36
I had a moment to ponder this and I'm thinking of how to approach "async EEx". This would all need to be done at compile time. It might be necessary to wrap Phoenix's LiveView EEx engine. We would need to traverse the generated AST, replacing the SSR-flagged Svelte components with assigns and accumulating them in a module attribute. Prior to rendering, we would send the SSR calls with something like …
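Roughly, the idea might look like the sketch below. This is hypothetical only: it assumes the SSR-flagged components were already collected into a module attribute by the compile-time pass, and `ssr_module/0` stands in for whatever implements the `LiveSvelte.SSR` behaviour; none of these names are real LiveSvelte APIs.

```elixir
# Hypothetical: start all SSR calls before the template renders, then await the
# results and expose them as assigns. @ssr_components stands in for whatever the
# compile-time AST pass would have accumulated; ssr_module/0 is a placeholder.
ssr_tasks =
  Enum.map(@ssr_components, fn {name, props, slots} ->
    {name, Task.async(fn -> ssr_module().render(name, props, slots) end)}
  end)

ssr_assigns =
  Map.new(ssr_tasks, fn {name, task} -> {name, Task.await(task, 5_000)} end)
```

With the calls in flight concurrently, N components would cost roughly one round trip instead of N.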
So I've been trying to perf this out on the server side, and I think there are really only some small optimizations we can do to get a slightly faster response. Here's an example from using a Unix domain socket server to host render requests in Node.js:

The render time inside Node.js is very fast, usually about a tenth of a millisecond. The rest of that time is spent serializing/deserializing the data. Still, 1ms is pretty fast for Elixir -> Node.js -> Elixir. This response time can be improved by using something like NimblePool and checking out sockets, rather than having a GenServer manage them.

My next thought was that maybe there are some hidden performance issues with Phoenix rendering. None there either, usually at most 1ms. There are some cases where I've noticed slower render performance, but I'm getting a pretty consistent 2ms for the total server-side render time (that is, from the full request, through the render call to Node.js, to returning the final template that gets sent to the browser). I removed the Phoenix SVG logo and that actually gave us better response times, lol. Next thing was to remove … I don't fully understand how …

Let me know if you have any thoughts. We might need to start an external discussion about this because it actually touches a lot of different parts.
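For reference, here is a rough sketch of what the NimblePool approach could look like. It is illustrative only, not LiveSvelte code: the socket path, the line-based wire format, and the module name are all assumptions.

```elixir
defmodule SSRSocketPool do
  @behaviour NimblePool

  # Each pool worker holds one long-lived Unix-domain-socket connection to the
  # Node render server, so callers check a socket out per render instead of
  # funnelling every request through a single GenServer.
  @impl NimblePool
  def init_worker(socket_path) do
    {:ok, socket} =
      :gen_tcp.connect({:local, socket_path}, 0, [:binary, active: false, packet: :line])

    {:ok, socket, socket_path}
  end

  @impl NimblePool
  def handle_checkout(:checkout, _from, socket, pool_state) do
    {:ok, socket, socket, pool_state}
  end

  @impl NimblePool
  def handle_checkin(socket, _from, _old_socket, pool_state) do
    {:ok, socket, pool_state}
  end

  # One render round trip: send a JSON line, read a JSON line back.
  def render(pool, payload, timeout \\ 5_000) do
    NimblePool.checkout!(pool, :checkout, fn _from, socket ->
      :ok = :gen_tcp.send(socket, Jason.encode!(payload) <> "\n")
      {:ok, line} = :gen_tcp.recv(socket, 0, timeout)
      {Jason.decode!(line), socket}
    end)
  end
end

# Started under a supervisor, e.g.:
#   {NimblePool, worker: {SSRSocketPool, "/tmp/ssr.sock"}, pool_size: 4, name: SSRSocketPool}
```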
Thank you for investigating. Do you have some code so I can reproduce some of the results? It could be cool to have a waterfall visualization of what's going on in each step. I'm very unfamiliar with all of this, but it looks like Node is actually not so slow; it's mostly the back-and-forth passing of the generated HTML/JS/CSS that's taking more time.
Hi all! The other day I was experimenting with LiveSvelte, and I'm trying to get it working under Deno 2 (still a work in progress, since it requires quite a few changes). Below is a custom implementation of the `LiveSvelte.SSR` behaviour that works with Bun:

```elixir
defmodule LiveSvelte.SSR.Bun do
@behaviour LiveSvelte.SSR
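# Sentinel prepended by server.js to the stdout line that carries its JSON reply,
# so it can be picked out from any other output (it appears to be the same marker
# elixir-nodejs uses).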
@prefix "__elixirnodejs__UOSBsDUP6bp9IF5__"
@read_chunk_size 65_536
@default_timeout 10_000
def bun_server_path() do
# Path to your existing "server.js"; by default it lives in the priv/svelte folder.
Application.app_dir(:test_livesvelte, "priv/svelte/server.js")
end
@impl true
def render(component_name, props, slots) do
with {:ok, port} <- ensure_port_started() do
case call_port(port, [component_name, props, slots], @default_timeout) do
{:ok, ssr_map} when is_map(ssr_map) ->
ssr_map
{:ok, nil} ->
nil
{:error, reason} ->
raise "Bun SSR error: #{inspect(reason)}"
end
else
{:error, reason} ->
raise "Bun SSR not configured or could not spawn port. Reason: #{inspect(reason)}"
end
end
defp ensure_port_started() do
path = bun_server_path()
bun = bun_executable()
if not File.exists?(path) do
{:error, "Cannot find Bun script at #{path}"}
else
port =
Port.open({:spawn_executable, bun}, [
{:args, [path]},
{:line, @read_chunk_size},
:use_stdio,
:exit_status,
:stderr_to_stdout
])
{:ok, port}
end
end
defp bun_executable, do: System.find_executable("bun") || "bun"
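# Writes one JSON-encoded request line to the Bun port and waits for the
# prefix-marked reply (or a timeout / port exit).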
defp call_port(port, payload, timeout) do
request_line = Jason.encode!(payload) <> "\n"
Port.command(port, request_line)
case receive_response("", port, timeout) do
{:ok, data} ->
{:ok, decode_response(data)}
{:error, :timeout} ->
{:error, :timeout}
{:error, {:exit, 0}} ->
{:ok, nil}
{:error, {:exit, code}} ->
{:error, {:exit, code}}
end
end
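# Accumulates stdout chunks from the port until a complete line arrives, the port
# exits, or the timeout fires.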
defp receive_response(acc, port, timeout) do
receive do
{^port, {:data, {flag, chunk}}} ->
chunk_bin = :erlang.iolist_to_binary(chunk)
new_acc = acc <> chunk_bin
case flag do
:noeol -> receive_response(new_acc, port, timeout)
:eol -> handle_possible_lines(new_acc, port, timeout)
end
{^port, {:exit_status, status}} ->
{:error, {:exit, status}}
after
timeout ->
{:error, :timeout}
end
end
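# Scans the accumulated output for the line starting with @prefix; anything
# before it (logs, warnings) is ignored.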
defp handle_possible_lines(acc, port, timeout) do
lines = String.split(acc, "\n", trim: true)
case Enum.split_while(lines, &(!String.starts_with?(&1, @prefix))) do
{_, [prefixed | _]} ->
data = String.replace_prefix(prefixed, @prefix, "")
{:ok, data}
{_, []} ->
# No prefix found in this chunk, keep reading
receive_response(acc <> "\n", port, timeout)
end
end
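# server.js replies with a JSON array: [true, result] on success,
# [false, message] on error.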
defp decode_response(json_string) do
case Jason.decode(json_string) do
{:ok, [true, value]} when is_map(value) ->
value
{:ok, [true, _nonmap]} ->
nil
{:ok, [false, error_message]} ->
# The server.js indicated an error
raise "SSR error: #{error_message}"
_ ->
# No valid JSON or shape
nil
end
end
end
```

And `build.js` in the `assets/` folder:

```javascript
const esbuild = require("esbuild");
const sveltePlugin = require("esbuild-svelte");
const importGlobPlugin = require("esbuild-plugin-import-glob").default;
const sveltePreprocess = require("svelte-preprocess");
const args = process.argv.slice(2);
const watch = args.includes("--watch");
const deploy = args.includes("--deploy");
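// Client bundle: hydratable Svelte components for the browser.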
let optsClient = {
entryPoints: ["js/app.js"],
bundle: true,
minify: deploy,
target: "es2017",
conditions: ["svelte", "browser"],
outdir: "../priv/static/assets",
logLevel: "info",
sourcemap: watch ? "inline" : false,
tsconfig: "./tsconfig.json",
plugins: [
importGlobPlugin(),
sveltePlugin({
preprocess: sveltePreprocess(),
compilerOptions: {
dev: !deploy,
hydratable: true,
css: "injected",
},
}),
],
};
// Use ES module output for SSR so Bun can run it nicely
let optsServer = {
entryPoints: ["js/server.js"],
platform: "neutral", // or 'node', but 'neutral' is more generic
format: "esm", // produce ESM
target: "es2020", // good modern baseline
bundle: true,
minify: false,
conditions: ["svelte"],
outdir: "../priv/svelte",
logLevel: "info",
sourcemap: watch ? "inline" : false,
tsconfig: "./tsconfig.json",
// The line that helps with apexcharts:
mainFields: ["module", "main", "browser"],
plugins: [
importGlobPlugin(),
sveltePlugin({
preprocess: sveltePreprocess(),
compilerOptions: {
dev: !deploy,
hydratable: true,
generate: "ssr",
},
}),
],
};
if (watch) {
esbuild
.context(optsClient)
.then((ctx) => ctx.watch())
.catch((_error) => process.exit(1));
esbuild
.context(optsServer)
.then((ctx) => ctx.watch())
.catch((_error) => process.exit(1));
} else {
esbuild.build(optsClient).catch((_error) => process.exit(1));
esbuild.build(optsServer).catch((_error) => process.exit(1));
}
```

I have a repo with a minimal code example. Honestly, I much prefer Bun over Node.js as the JavaScript engine, and Deno 2 even more. Cheers.
I made a repo showing how we can generate flamegraphs. Maybe this can help in the future.
Currently SSR takes about 8ms per render in production, and about 20-40ms in dev, and it is executed synchronously.
So let's say you have 10 components inside a LiveView: it'll end up taking 80ms!
With SSR turned off it's super fast; you can easily render 100 components and not feel any delay.
The culprit is the call to NodeJS.
Doing the following things will probably do the trick.
Maybe Bun is the answer here, or a better way of calling Node.
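For scale: 10 components at ~8ms per synchronous call is the 80ms above; if the calls run concurrently against a pool of Node workers, the wall-clock cost is roughly one round trip. Below is a sketch under the assumption that elixir-nodejs (NodeJS.Supervisor and NodeJS.call!) is the bridge and that the SSR bundle exposes a render export; both names are illustrative, not necessarily how LiveSvelte wires it up.

```elixir
# Assumed wiring, for illustration only: a larger Node worker pool in the
# application's supervision tree.
children = [
  {NodeJS.Supervisor, [path: Application.app_dir(:my_app, "priv/svelte"), pool_size: 10]}
]

# 10 sequential calls at ~8ms each ≈ 80ms; run in parallel against a pool of 10,
# wall-clock time is roughly one round trip plus scheduling overhead.
ssr_html =
  components
  |> Task.async_stream(
    fn {name, props, slots} -> NodeJS.call!({"server", :render}, [name, props, slots]) end,
    max_concurrency: 10,
    timeout: 5_000
  )
  |> Enum.map(fn {:ok, html} -> html end)
```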