gfs is taking a long time #166
I can see in the data that this consumer does trim the data down.
I just ran this locally in Python and it seems to take about ~10 seconds.
This makes me think perhaps the consumer doesn't have enough memory, and is therefore running slowly.
I've tried upping the memory from 5 GB to 6 GB, but it doesn't seem to make much difference right now.
I will try upping the CPU from 1 to 2, as the logs show a lot of CPU usage.
One other option is to save the file post-mapping, so the conversion would only need to be done once.
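The save-once idea above could be sketched as a small cache wrapper: run the expensive mapping only if a cached output doesn't already exist. This is a minimal sketch; `convert` is a hypothetical callable standing in for the consumer's GRIB-to-xarray mapping step, and the `.zarr` suffix is just an assumption about the processed format.

```python
from pathlib import Path
from typing import Callable


def cached_convert(raw_path: str, cache_dir: str,
                   convert: Callable[[str, str], str]) -> str:
    """Run the expensive mapping once per raw file, then reuse the result.

    `convert(raw_path, out_path)` is assumed to write the mapped dataset
    to `out_path` and return that path (hypothetical signature).
    """
    cache = Path(cache_dir) / (Path(raw_path).stem + ".zarr")
    if cache.exists():
        # Mapping already done for this raw file; skip the slow step.
        return str(cache)
    return convert(raw_path, str(cache))
```

A usage note: the cache key here is just the raw filename stem, so if the same init time can produce different raw files, a hash of the file contents would be a safer key.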
Roughly, the time seems to be halved when adding more compute.
It looks like the loading takes the longest time; the other steps are quick.
Perhaps there is a good way to check if the raw datafile has already been downloaded into S3, and if so, download it from there rather than from the GFS server. Similar for other NWP providers.
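The "is it already in S3?" check could look something like the sketch below: a `HeadObject` call is cheap and doesn't download the body, so it works as an existence test before deciding where to fetch from. The helper name and the injected client are hypothetical; with boto3 the client would come from `boto3.client("s3")`.

```python
def object_exists(s3_client, bucket: str, key: str) -> bool:
    """Return True if the raw file is already present in S3.

    `s3_client` is assumed to be a boto3-style S3 client exposing
    `head_object` and `exceptions.ClientError` (as boto3 clients do).
    """
    try:
        s3_client.head_object(Bucket=bucket, Key=key)
        return True
    except s3_client.exceptions.ClientError:
        # 404 (and similar) surface as ClientError: treat as "not cached".
        return False
```

The consumer could then branch: download from S3 when `object_exists(...)` is true, otherwise pull from the GFS server and upload the raw file for next time.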
Detailed Description
It looks like downloading the data doesn't take too long, but mapping it to xarray does.
I think the code bit is here - https://github.com/openclimatefix/nwp-consumer/blob/main/src/nwp_consumer/internal/inputs/noaa/aws.py#L108
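To confirm that the mapping step (rather than the download) dominates, a small timing context manager could be dropped around each phase. This is a generic sketch, not code from the linked file; the `download_gfs_file` name and the cfgrib engine argument in the usage comment are assumptions.

```python
import time
from contextlib import contextmanager


@contextmanager
def timed(label: str):
    """Print how long the wrapped block takes, to separate
    download time from GRIB-to-xarray mapping time."""
    start = time.perf_counter()
    try:
        yield
    finally:
        print(f"{label}: {time.perf_counter() - start:.2f}s")


# Usage (names hypothetical):
# with timed("download"):
#     raw = download_gfs_file(init_time)
# with timed("map to xarray"):
#     ds = xr.open_dataset(raw, engine="cfgrib")
```

If "map to xarray" dominates as suspected, that points at the GRIB decode rather than network or CPU/memory limits.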
Context
Possible Implementation