
refactor: proxy/preview server #231

Merged: 1 commit, Jan 12, 2022

Conversation

@threepointone (Contributor) commented Jan 11, 2022

Closes #205.

This PR refactors how we set up the proxy server between the developer and the edge preview service during wrangler dev. Of note, we start the server immediately. We also buffer requests/streams and hold on to them while starting/refreshing the token. This means a developer should never see an ERR_CONNECTION_REFUSED error page, or have an older worker respond after making a change to the code. And when the token does get refreshed, we flush said streams/requests with the newer values, making the iteration process a lot smoother and more predictable.
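As a rough illustration of the buffering behaviour described above (a minimal sketch with hypothetical names such as `buffered` and `onTokenReady`, not the actual wrangler implementation):

import { createServer, IncomingMessage, ServerResponse } from "node:http";

// Hypothetical sketch: names are illustrative, not taken from the wrangler source.
const buffered: { request: IncomingMessage; response: ServerResponse }[] = [];
let realHandler: ((req: IncomingMessage, res: ServerResponse) => void) | undefined;

// Start listening immediately so the browser never sees ERR_CONNECTION_REFUSED.
const server = createServer((request, response) => {
  if (realHandler) {
    realHandler(request, response);
  } else {
    // No preview token yet: hold on to the request instead of refusing it.
    buffered.push({ request, response });
  }
});
server.listen(8787);

// Called whenever a (new) preview token becomes available.
function onTokenReady(handler: (req: IncomingMessage, res: ServerResponse) => void) {
  realHandler = handler;
  // Flush and replay everything that arrived while we were waiting.
  for (const { request, response } of buffered.splice(0)) {
    handler(request, response);
  }
}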

@changeset-bot (bot) commented Jan 11, 2022

🦋 Changeset detected

Latest commit: c3ea521

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:
Name      Type
wrangler  Patch


@petebacondarwin (Contributor) left a comment:

OK so I found it quite hard to follow what had changed and what was going on - mainly due to my lack of experience in this part of the code base. But I think it looks like it should work.


refactor: proxy/preview server

This PR refactors how we setup the proxy server between the developer and the edge preview service during `wrangler dev`. Of note, we start the server immediately. We also buffers requests/streams and hold on to the, when starting/refreshing the token. This means a developer should never see `ERR_CONNECTION_REFUSED` error page, or have an older worker respond after making a change to the code. And when the token does get refreshed, we flush said streams/requests with the newer values, making the iteration process a lot smoother and predictable.
Contributor:

Suggested change:
- This PR refactors how we setup the proxy server between the developer and the edge preview service during `wrangler dev`. Of note, we start the server immediately. We also buffers requests/streams and hold on to the, when starting/refreshing the token. This means a developer should never see `ERR_CONNECTION_REFUSED` error page, or have an older worker respond after making a change to the code. And when the token does get refreshed, we flush said streams/requests with the newer values, making the iteration process a lot smoother and predictable.
+ This PR refactors how we setup the proxy server between the developer and the edge preview service during `wrangler dev`. Of note, we start the server immediately. We also buffer requests/streams and hold on to them, when starting/refreshing the token. This means a developer should never see `ERR_CONNECTION_REFUSED` error page, or have an older worker respond after making a change to the code. And when the token does get refreshed, we flush said streams/requests with the newer values, making the iteration process a lot smoother and predictable.


/**
 * Creates a HTTP/1 proxy that sends requests over HTTP/2.
 * `usePreviewServer` is a React hook that creates a local development
Contributor:

When you say "local" do you really mean "on the local machine" (e.g. Miniflare)?
Or is this preview server actually a real Worker running at the Edge?

Contributor:

By the way, this comment seems in the wrong place. Shouldn't it be directly above the usePreview() function below?

Contributor Author:

It's both: the worker we upload gets exposed from the edge, and we start up a local development server on our own machine that acts as a proxy to it. In local mode, it proxies to miniflare.
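A rough sketch of that shape (hypothetical function name, port, and header; not the actual wrangler code): the browser always talks to the local server, which forwards either to miniflare or to the worker exposed by the edge preview service.

import { request as httpRequest, IncomingMessage, ServerResponse } from "node:http";
import { request as httpsRequest } from "node:https";

// Hypothetical sketch: `forward`, the miniflare port, and the preview-token
// header shown here are illustrative, not taken from the wrangler source.
function forward(
  req: IncomingMessage,
  res: ServerResponse,
  local: boolean,
  previewHost: string,
  previewToken: string
) {
  const upstream = local
    ? // local mode: proxy to miniflare running on this machine (port illustrative)
      httpRequest({ host: "127.0.0.1", port: 8788, path: req.url, method: req.method, headers: req.headers })
    : // remote mode: proxy to the worker exposed by the edge preview service,
      // attaching the preview token (wrangler does this via addCfPreviewTokenHeader)
      httpsRequest({
        host: previewHost,
        path: req.url,
        method: req.method,
        headers: { ...req.headers, host: previewHost, "cf-workers-preview-token": previewToken },
      });
  upstream.on("response", (upstreamRes) => {
    res.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
    upstreamRes.pipe(res);
  });
  req.pipe(upstream);
}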

  host: string,
  port: number
) {
  for (const [name, value] of Object.entries(headers)) {
Contributor:

I do prefer this approach to iteration over using .forEach() etc. Nice!
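(Illustrative aside, not from the PR: the for...of form supports break/continue and await in a way a .forEach() callback does not.)

// Illustrative comparison only, not taken from the wrangler source.
const headers: Record<string, string> = { host: "example.com", accept: "text/html" };

for (const [name, value] of Object.entries(headers)) {
  if (name === "host") continue; // early skip/exit is straightforward
  console.log(name, value);
}

Object.entries(headers).forEach(([name, value]) => {
  // no break/continue here, and an await would be fire-and-forget
  console.log(name, value);
});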

Comment on lines 54 to 55
host: string,
port: number
Contributor:

NIT - consider renaming these for clarity: host -> remoteHost, port -> localPort.

proxy.on("request", bufferRequestResponse);
cleanupListeners.push(() => proxy.off("request", bufferRequestResponse));
return () => {
  cleanupListeners.forEach((cleanup) => cleanup());
Contributor:

Do we need to worry about memory leaks here?
Would it be safer to also empty the cleanupListeners array?

Contributor Author:

Thinking this out loud: every reference inside the effect is dereferenced after this function is called, so I don't think clearing the array would help any more. Specifically, cleanupListeners itself is dereferenced. If any of the listeners is itself caught hanging on to something, then wiping this array wouldn't help anyway.
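A simplified sketch of the pattern under discussion (hypothetical hook name; the real effect does more): once React calls and then drops the returned cleanup function, cleanupListeners and its closures become unreachable, so emptying the array is optional hygiene rather than a leak fix.

import { useEffect } from "react";
import type { Server } from "node:http";

// Hypothetical, simplified sketch of the cleanup pattern being discussed.
function useProxyListeners(proxy: Server) {
  useEffect(() => {
    const cleanupListeners: (() => void)[] = [];

    const onRequest = () => {
      /* handle or buffer the request */
    };
    proxy.on("request", onRequest);
    cleanupListeners.push(() => proxy.off("request", onRequest));

    return () => {
      cleanupListeners.forEach((cleanup) => cleanup());
      // Optionally: cleanupListeners.length = 0;
      // Once React drops this function, the array is unreachable anyway.
    };
  }, [proxy]);
}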

Comment on lines 225 to 256
if (assetPath) {
  const handleAsset = serveStatic(assetPath, {
    cacheControl: false,
  });
  const handleAssetsAndRequests = (
    request: IncomingMessage,
    response: ServerResponse
  ) => {
    handleAsset(request, response, () => {
      handleRequest(request, response);
    });
  };

  proxy.on("request", handleAssetsAndRequests);
  cleanupListeners.push(() =>
    proxy.off("request", handleAssetsAndRequests)
  );

  // flush and replay buffered requests
  requestResponseBufferRef.current.forEach(({ request, response }) =>
    handleAssetsAndRequests(request, response)
  );
  requestResponseBufferRef.current = [];
} else {
  proxy.on("request", handleRequest);
  cleanupListeners.push(() => proxy.off("request", handleRequest));
  // flush and replay buffered requests
  requestResponseBufferRef.current.forEach(({ request, response }) =>
    handleRequest(request, response)
  );
  requestResponseBufferRef.current = [];
}
Contributor:

NIT - this whole block could be simplified to remove some of the duplication - for example:

const actualHandleRequest = assetPath ?
    createHandleAssetsRequest(assetPath, handleRequest) :
    handleRequest;

proxy.on("request", actualHandleRequest);
cleanupListeners.push(() => proxy.off("request", actualHandleRequest));

// flush and replay buffered requests
requestResponseBufferRef.current.forEach(({ request, response }) =>
  actualHandleRequest(request, response)
);
requestResponseBufferRef.current = [];

// ... somewhere else

function createHandleAssetsRequest(assetPath: string, handleRequest: RequestListener) {
  const handleAsset = serveStatic(assetPath, {
    cacheControl: false,
  });
  return (
    request: IncomingMessage,
    response: ServerResponse
  ) => {
    handleAsset(request, response, () => {
      handleRequest(request, response);
    });
  };
}

Contributor Author:

I like this!

addCfPreviewTokenHeader(headers, previewToken.value);
headers["host"] = previewToken.host;
const localWebsocket = new WebSocket(message, socket, body);
// TODO(soon): Custom WebSocket protocol is not working?
Contributor:

Is this where we need to handshake over the "Subprotocol negotiation" of websocket that is being requested?
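For reference, subprotocol negotiation in general means the server echoes back one of the protocols the client offered in Sec-WebSocket-Protocol; if a proxy drops that header or never answers it, custom protocols fail. A generic sketch using the ws package (illustrative; not the library or code used in this PR):

import { WebSocketServer } from "ws";

// Generic sketch of subprotocol negotiation; not wrangler's implementation.
const wss = new WebSocketServer({
  port: 9090,
  // Pick one of the protocols the client offered in Sec-WebSocket-Protocol.
  // Returning false accepts the connection with no subprotocol.
  handleProtocols: (protocols) => (protocols.has("example.v1") ? "example.v1" : false),
});

wss.on("connection", (socket) => {
  // socket.protocol holds whatever was echoed back to the client.
  console.log("accepted subprotocol:", socket.protocol);
});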

Contributor Author:

I'm actually not sure; this code is from the previous impl.

Closes #205

This PR refactors how we set up the proxy server between the developer and the edge preview service during `wrangler dev`. Of note, we start the server immediately. We also buffer requests/streams and hold on to them when starting/refreshing the token. This means a developer should never see an `ERR_CONNECTION_REFUSED` error page, or have an older worker respond after making a change to the code. When the token does get refreshed, we flush said streams/requests with the newer values.
Development

Successfully merging this pull request may close these issues.

enhance: server proxy
2 participants