Caching the list of resources for individual request paths #1
I've added a method:

```go
r, _ := os.Open("www/index.html")

uris, err := p.List(r, "www.example.com", "/index.html")
if err != nil {
	panic(err)
}

cache["/index.html"] = uris
```

Then you could push them yourself:

```go
http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
	if pusher, ok := w.(http.Pusher); ok {
		if uris, ok := cache[r.RequestURI]; ok {
			for _, uri := range uris {
				_ = pusher.Push(uri, nil)
			}
		}
	}
	// ...
})
```

Make sure you cache only HTML files, for example, so that …
That's a good start -- can we disable parsing of other types of assets, so that we can cache everything that we push for (CSS, etc.)? I don't want to cause a failure with the recursive push, but CSS files are common enough that I'd like to cache them too. Or maybe there's some better way to handle it without taking a perf hit.
I pushed a big refactor, which makes the distinction clearer between regular parsing and recursive parsing.

For example, parse a file and list its URIs:

```go
r, _ := os.Open("index.html")

parser, err := push.NewParser("example.com", "/", "/index.html")
if err != nil {
	panic(err)
}

uris, err := push.List(parser, r, "text/html")
if err != nil {
	panic(err)
}
```

Parse a file, read and parse its resources, and list all URIs:

```go
r, _ := os.Open("index.html")

parser, err := push.NewRecursiveParser("example.com", "/", push.FileOpenerFunc(func(uri string) (io.Reader, string, error) {
	// open file for uri
	return r, mimetype, nil
}), "/index.html")
if err != nil {
	panic(err)
}

uris, err := push.List(parser, r, "text/html")
if err != nil {
	panic(err)
}
```

Maybe I'll abstract away the whole recursive parsing into a type that fulfills the …
Excellent; I'm going to start looking into integrating this with @wendigo's PR over at caddyserver/caddy#1215. Thank you!
This is great, looking forward to trying it out!
If it works well, we'd like to use it in Caddy.
Before we can do that, though, we need to avoid parsing the content on every request. How would you approach caching with this package? I envision a `map[string][]string` of sorts, where each key is a request path and each value is a list of resources to push for that path. The cache entry could be invalidated when the file pointed to by the key changes, or after a certain amount of time. Would this be something you could expose an API for, or would you build it in?