
Cold Start and Warm Start on Consumption Plan #131

Closed
justinyoo opened this issue Jan 7, 2017 · 17 comments

Comments

@justinyoo

Hi, Team.

When I looked into the docs, https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale, they don't mention startup (or warm-up) time, only the 5-minute execution time limit in a note. I know a function execution can't take more than 5 minutes on the Consumption Plan.

Here are my questions on Consumption Plan:

  • Are there differences between cold-start and warm-start in terms of startup time?
  • If so, when does a cold start happen rather than a warm start?

If I remember the Twitter conversation with @christopheranderson the other day correctly, when no function is called on a function app on the Consumption Plan for more than 5 minutes, the app goes into idle mode. This implies that a new function call after 5 minutes of inactivity, i.e. after the app has gone idle, triggers a cold start that takes longer than a warm start.

In this situation, how much longer does a cold start take than a warm start, in general? I realise it may depend on how complex the function is.

Cheers,

@Noradrex

Any updates on this?
I'm experiencing 7-second delays in HTTP-triggered Functions between the request and the start of actual processing. It looks like the functions go into idle mode after less than 30 minutes of inactivity (I have no more data right now; it could be 5 minutes or more).
This makes the service hard to recommend for some scenarios without additional work (warm-up systems), yet there is no information about this limitation.

@mikezks

mikezks commented Mar 3, 2017

Preventing the switch to cold-start mode is quite easy: just add a timer-triggered function to the same function app that executes every 5 minutes. But be aware that this may cause additional costs if your free execution credit is exceeded.
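A minimal sketch of such a timer-triggered keep-warm function, assuming the Node.js programming model (the function name and schedule are placeholders; the schedule goes in the function's `function.json`):

```javascript
// index.js of a hypothetical "KeepWarm" function. Its function.json would
// bind a timer trigger with the CRON schedule "0 */5 * * * *" (every 5 min).
async function keepWarm(context, myTimer) {
  // The body can be trivial; the point is simply to keep the app loaded.
  context.log(`Keep-warm ping fired at ${new Date().toISOString()}`);
}

module.exports = keepWarm;
```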

@ggailey777
Contributor

@mamaso suggested that "we may want to add info about https://github.com/christopheranderson/azure-functions-pack and Azure/azure-functions-host#298" to the Node.js reference topic.

@perosb

perosb commented Sep 25, 2017

This also occurs when scaling out: all new servers are added to the load balancer cold, making for very long requests.
I'm currently running some tests; a request under no load takes < 200 ms to first byte.
[screenshot: response-time test results]

@perosb

perosb commented Sep 27, 2017

During our tests the function app slows to a crawl and is basically unresponsive for 20–50 seconds.
[screenshot: load-test response times]

@NicolasHumann

Hi, I have the same kind of issue.

I'm running some Azure Functions with HTTP triggers. I've noticed, quite often, that some functions take more than 10 seconds to load and start my code.

As you can see, I put a trace at the beginning and at the end of the function. My code takes 2.79 s to run, but the function takes 8 s to load.

My functions use:

  • c# and precompiled dll
  • App Service Plan running on S2
  • Always On is activated
  • The runtime is ~1

Sometimes it takes 30 s at 100% CPU just to load the function.

@manoharreddyporeddy

manoharreddyporeddy commented Jan 11, 2018

I followed @mikezks's example of pinging every 5 minutes.

The only changes I made were:

  1. ping every 3 minutes
  2. hit a root function that then pings the other required functions

More info:

  • With a timer trigger, have one function that pings the other required functions every 3 minutes.

@mderazon

mderazon commented Feb 7, 2018

Is there any way to enable warm start so the function is accessible immediately? The periodic ping is a nice workaround, but it is really a hack.

I am experiencing ~30-second warm-up times.

@manoharreddyporeddy

Below are more details:

Start:

  1. There are multiple plans like Consumption Plan, etc.
  2. We can have multiple Function apps under a Plan.
  3. Each Function app can have multiple Functions.

Actual:

  1. The warm-up time is due to a cold start: the function got unloaded (the worker process was killed). This warm-up time grows with the number of functions you have in a single function app.
  2. The alternative is one function per function app, but that defeats the purpose of a function app holding multiple functions, and multiple apps may cost more (also check whether there is a limit on the number of function apps per subscription).
  3. The every-3-minutes ping workaround is needed mainly because of the plan you have, e.g. the Consumption Plan. The plans appear as different icons (blue, green, etc.) in your resources.
  4. Upgrading to another plan returns responses in much less time, because a full virtual machine/container is always running; that defeats the purpose of being billed "only for function runtime", so there may be significant costs if you upgrade to the next plan, depending on your scenarios.

Hope that helps.

@mderazon

mderazon commented Feb 7, 2018

Thanks @manoharreddyporeddy

The otherwise solution is one Function per one Function app

Even with that, it takes 1.2 minutes (!) for my function to warm up.

[screenshot, 2018-02-07: ~1.2-minute warm-up trace]

I have used Lambda and Google Cloud Functions in the past and don't remember having this side effect, or having to think about these things at all.

This is the first time I've heard about this consumption-plan behaviour; I will check that out, thanks.

MS, why not keep things simple 😞

@darshanrampatel

@mderazon Just saw this timely article about this: https://blogs.msdn.microsoft.com/appserviceteam/2018/02/07/understanding-serverless-cold-start/

@manoharreddyporeddy

Just changing into its own function app should not change ~30 seconds into 1.2 minutes.

Since it has the reverse effect, surely something else has also changed.

In our experience:

  1. One function per function app reduces the time.
  2. Fewer dependencies/libraries in node_modules (when on Node.js) reduce the time.
  3. When dependencies are a must, we trim them down, for example reducing a 100-function file to a 20-function file (this is sometimes cumbersome, so it is a last resort).
  4. Load dependencies only when required: keep a global uninitialized variable for each dependency; only when some part of the code needs that dependency is it loaded and assigned to the global variable. The next time, since the global variable already exists, we use it instead of reloading.
  5. A change of plan has a 100% effect, but costs more.
  6. The ping solution, a timer trigger every 3 minutes on one function in the function app that pings other functions in the same/other function apps, always works; note, though, that if time is critical you should load all dependencies during the ping, since the ping is always running anyway.
  7. If none of the above appeals, there is another option: WebJobs (which we haven't moved to, yet).
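Item 4 above (lazy-loading dependencies) can be sketched like this; Node's built-in `crypto` stands in here for whatever heavy dependency you would otherwise load at cold start:

```javascript
// Module-scope variable left uninitialized at cold start; the dependency is
// only require()d on the first call that actually needs it, then reused.
let lazyCrypto;

function getCrypto() {
  if (!lazyCrypto) {
    lazyCrypto = require('crypto'); // paid for once, on first use
  }
  return lazyCrypto;
}

function sha256(text) {
  // Only this code path triggers the load.
  return getCrypto().createHash('sha256').update(text).digest('hex');
}
```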

Hope that helps.

@bhosale-ajay

We used webpack to bundle the npm module dependencies into a single file to reduce the cold-start time; refer to this Medium article.

@mderazon

Thanks for the article.
@mikezks's suggestion to use a timer-trigger function works like a charm for now. Let's see the bill at the end of the month :-p

@nevercast

nevercast commented Jun 9, 2018

The issue seems to be the number of files that get loaded. If you can reduce your file count (even if that means increasing file size), you'll see much better cold-start times. The IOPS of the Storage Account file share seems low compared to its bandwidth.

We used Webpack too, since we have a Node.js application. This also allowed us to move to Typescript with all the ES6 goodies and have babel do the polyfill.

We went with something similar to this: Azure/azure-functions-host#298 (comment), but a bit more elaborate. We are webpacking our non-HTTP functions too.

Maybe also look at func pack, Azure/azure-functions-host#298 (comment)

Edit: since you didn't state it, if you're on C#, the related issue is here: Azure/azure-functions-host#838

@ElliotSchmelliot

Found a great new article (May 2018) with cold-start testing stats for both infrequent and concurrent function calls. The results show that a "keep-it-warm" trigger does not eliminate cold starts completely, especially for the concurrent-call use case.

Just a heads up.

@jeffhollan
Contributor

Closing this: cold-start times are so variable that we have a best-practices doc, but nothing that states "expect x cold start".
