Question about performance (big datasets) #952
Comments
Generally, everything in the Optimization guide applies to the client library as well. The forum agents have probably already suggested it to you; make sure you check it out too.
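For large read-only reports, one of the guide's main recommendations is to stream results with searchStream() instead of paging through repeated search() calls, so each account needs a single streaming call per report. A minimal sketch, assuming a google-ads-php release where searchStream() takes a customer ID and GAQL query directly (newer versions wrap them in a SearchGoogleAdsStreamRequest); the API version namespace, customer ID, and query are illustrative:

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Google\Ads\GoogleAds\Lib\OAuth2TokenBuilder;
use Google\Ads\GoogleAds\Lib\V12\GoogleAdsClientBuilder;

// Credentials come from the standard google_ads_php.ini file.
$oAuth2Credential = (new OAuth2TokenBuilder())->fromFile()->build();
$googleAdsClient = (new GoogleAdsClientBuilder())
    ->fromFile()
    ->withOAuth2Credential($oAuth2Credential)
    ->build();

// One streaming call returns every matching row for the account,
// instead of one paged request per result page.
$query = 'SELECT campaign_budget.id, campaign_budget.amount_micros '
    . 'FROM campaign_budget';
$stream = $googleAdsClient->getGoogleAdsServiceClient()
    ->searchStream('1234567890', $query);

foreach ($stream->iterateAllElements() as $googleAdsRow) {
    printf(
        "Budget %d: %d micros%s",
        $googleAdsRow->getCampaignBudget()->getId(),
        $googleAdsRow->getCampaignBudget()->getAmountMicros(),
        PHP_EOL
    );
}
```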
How large is the result of your request?
Hi! Fetching keywords takes around 30k-45k API requests (because of the large number of accounts across our small clients), so that's a headache with PHP scripts :) Thanks for the links, I'll check them out.
I mean, how many rows do you fetch (in the case of searching), or how many campaigns do you mutate, in the request that you mentioned takes 1-2 seconds?
Well, unfortunately, it's difficult to say for the keywords request, but for fetching budgets it's ~1 second per account. So, is it possible to rewrite it so that data can be fetched from several accounts in one request? What do you think?
@AlexRyabikov you could use non-blocking requests.
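Since hakimio's exact suggestion is cut off above, here is one hedged reading: run the per-account requests concurrently rather than one after another. A minimal sketch using multi-process parallelism via PHP's pcntl extension (Unix only) rather than true non-blocking I/O; fetchAccountReport() is a hypothetical helper standing in for the per-account API calls, and the customer IDs and worker cap are illustrative:

```php
<?php
// Hypothetical helper: stream one account's report and persist the rows.
function fetchAccountReport(string $customerId): void
{
    // ... build a GoogleAdsClient and run searchStream() here ...
}

$customerIds = ['1111111111', '2222222222', '3333333333', '4444444444'];
$maxWorkers = 4; // illustrative concurrency cap
$running = 0;

foreach ($customerIds as $customerId) {
    if ($running >= $maxWorkers) {
        pcntl_wait($status); // block until one child finishes
        $running--;
    }
    $pid = pcntl_fork();
    if ($pid === -1) {
        exit('fork failed' . PHP_EOL);
    }
    if ($pid === 0) {
        // Child process: build a fresh API client inside the child,
        // since gRPC channels must not be shared across forks.
        fetchAccountReport($customerId);
        exit(0);
    }
    $running++; // parent continues dispatching
}

while ($running-- > 0) {
    pcntl_wait($status); // reap the remaining children
}
```

At ~1 second per account, N workers should cut the wall-clock time roughly N-fold, up to the API's rate limits.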
@hakimio hi! Thanks, sounds good to me. I'll check it out :)
This can be closed now.
Hi!
The team from the Google Ads API forum sent me here :)
I have a question about optimization: our project is an advertising agency, and we use the Google Ads API, which has been working fine for us. However, we now need to retrieve large sets of data (keywords, budgets) to build reports in BI. Unfortunately, the functions in the Google Ads PHP library that we are using show very low performance: one request takes 1-2 seconds, and with volumes in the tens of thousands of clients, this is a very resource-intensive operation.
Could you please suggest alternatives for solving this issue?