This repository has been archived by the owner on Jul 18, 2024. It is now read-only.

[DataCap Application] < OriginStorage > - <historical and future climate simulations from 1980-2100> <02> #2299

Open
Tom-OriginStorage opened this issue Dec 25, 2023 · 31 comments

Comments

@Tom-OriginStorage

Tom-OriginStorage commented Dec 25, 2023

Data Owner Name

UCLA Center for Climate Science

What is your role related to the dataset

Data onramp entity that provides data onboarding services to multiple clients

Data Owner Country/Region

United States

Data Owner Industry

Environment

Website

https://dept.atmos.ucla.edu/alexhall/downscaling-cmip6

Social Media

https://dept.atmos.ucla.edu/alexhall/downscaling-cmip6
https://registry.opendata.aws/wrf-cmip6/

Total amount of DataCap being requested

15PiB

Expected size of single dataset (one copy)

32G

Number of replicas to store

1

Weekly allocation of DataCap requested

1PiB

On-chain address for first allocation

f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa

Data Type of Application

Slingshot

Custom multisig

  • Use Custom Multisig

Identifier

No response

Share a brief history of your project and organization

Using the Weather Research and Forecasting (WRF) model, we directly dynamically downscale multiple global climate models (GCMs) reporting to the 6th Coupled Model Intercomparison Project (CMIP6) from 1980 through 2100 to quantify the climate change signal in high resolution across the western United States (WUS). A 9-km resolution grid encompasses large river basins of western North America, while two 3-km resolution “convection permitting” simulations are performed across the entire state of California and most of Wyoming. We have produced three tiers of data from our simulations to serve a range of interested users, including 21 hourly variables and 30 daily variables. Please contact Stefan Rahimi for access and for more information.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

Using the Weather Research and Forecasting (WRF) model, we directly dynamically downscale multiple global climate models (GCMs) reporting to the 6th Coupled Model Intercomparison Project (CMIP6) from 1980 through 2100 to quantify the climate change signal in high resolution across the western United States (WUS). A 9-km resolution grid encompasses large river basins of western North America, while two 3-km resolution “convection permitting” simulations are performed across the entire state of California and most of Wyoming. We have produced three tiers of data from our simulations to serve a range of interested users, including 21 hourly variables and 30 daily variables. Please contact Stefan Rahimi for access and for more information.
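To give a concrete sense of the file structure, here is a minimal sketch of inspecting one of the simulation output files with xarray. WRF typically writes NetCDF, but the file name, the variable name T2, and the assumption that the raw WRF variable names are retained in the published tiers are illustrative assumptions, not details confirmed by the application.

```python
import xarray as xr

# Hypothetical local file name; WRF output is normally NetCDF, so any
# downloaded hourly-tier or daily-tier file can be inspected the same way.
ds = xr.open_dataset("wrfout_d02_hourly_sample.nc")

print(ds.data_vars)   # variables carried in this tier (e.g. the 21 hourly fields)
print(ds.dims)        # grid dimensions such as time, south_north, west_east
print(ds.get("T2"))   # 2-m temperature, if the raw WRF variable name is kept
```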

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer. What is your location (Country/Region)

Singapore

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?

First, I will organize the local data and download data from the cloud via aria2c.
Then, I will prepare CAR files with Singularity and store the data mapping in MongoDB.
Finally, I will make deals with SPs via Singularity or Boost.
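As a rough illustration of the workflow described above, here is a minimal sketch of the download and bookkeeping steps. The URL list, file paths, database name, and collection name are placeholders, and the Singularity and Boost invocations are deliberately left out because their exact commands depend on the versions deployed.

```python
import subprocess
from pymongo import MongoClient

# 1. Bulk-download the source objects with aria2c (urls.txt is a hypothetical
#    list of object URLs, one per line).
subprocess.run(
    ["aria2c", "--input-file=urls.txt", "-x", "16", "-d", "./downloads"],
    check=True,
)

# 2. After Singularity has packed the downloads into CAR files, record the
#    source-to-piece mapping so each deal can be traced back to its data.
client = MongoClient("mongodb://localhost:27017")
mappings = client["dataprep"]["car_mappings"]          # placeholder names
mappings.insert_one({
    "source": "downloads/example_variable_1980.nc",    # hypothetical source file
    "car": "cars/example_piece.car",                    # hypothetical CAR path
    "piece_cid": "baga...",                             # taken from Singularity's output
})
```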

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

We could not find a complete record of this dataset having been stored on the network before. Currently, our partner wants to store one copy.

Please share a sample of the data

1.4P, https://registry.opendata.aws/wrf-cmip6/
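Since the sample is hosted through the AWS Open Data registry, a quick way to browse it is anonymous S3 access. The sketch below lists a few objects; the bucket name is an assumption taken from the registry page and should be confirmed there before use.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

BUCKET = "wrf-cmip6-noversioning"  # assumed bucket name; verify on the registry page

# Anonymous (unsigned) client, since this is a public Open Data bucket.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(Bucket=BUCKET, MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```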

Confirm that this is a public dataset that can be retrieved by anyone on the Network

  • I confirm

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

2 to 3 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, Africa, North America

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server

How do you plan to choose storage providers

Slack, Big Data Exchange

If you answered "Others" in the previous question, what is the tool or platform you plan to use

No response

If you already have a list of storage providers to work with, fill out their names and provider IDs below

Provider ID | Contact | Email | Organization | Country
f02838518 | akcd4040 | [email protected] | bitwind | Russia
f02832475 | Lee | [email protected] | HS88 | Thailand
f02859053 | miaozi | [email protected] | chainup | USA
f02830321 | OriginStorage | [email protected] | OriginStorage | Vietnam
f02837226 | Jerry | [email protected] | kinghash | Britain

How do you plan to make deals to your storage providers

Boost client, Lotus client

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes


Thanks for your request!
Everything looks good. 👌

A Governance Team member will review the information provided and contact you back pretty soon.

@Sunnyiscoming
Collaborator

If you answered "Other" in the previous question, enter the details here
No response

If you are a data preparer. What is your location (Country/Region)
None

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?
No response

If you are not preparing the data, who will prepare the data? (Provide name and business)
No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.
No response

Please supplement your answers to the questions listed above.

@Tom-OriginStorage
Author

If you are a data preparer. What is your location (Country/Region)
Singapore

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?
First, I will organize the local data and download data from the cloud via aria2c.
Then, I will prepare CAR files with Singularity and store the data mapping in MongoDB.
Finally, I will make deals with SPs via Singularity or Boost.

If you are not preparing the data, who will prepare the data? (Provide name and business)
No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.
We could not find a complete record of this dataset having been stored on the network before. Currently, our partner wants to store one copy.

Please share a sample of the data
1.4P, https://registry.opendata.aws/wrf-cmip6/

@Sunnyiscoming Added, thank you.

@Sunnyiscoming
Collaborator

Hello, per filecoin-project/notary-governance#922, for Open, Public Dataset applicants, please complete the following Fil+ registration form to identify yourself as the applicant, and please also add the contact information of the SP entities you are working with to store copies of the data.

This information will be reviewed by the Fil+ Governance team to confirm validity, and then the application will be allowed to move forward for additional notary review.

@Sunnyiscoming
Collaborator

SP List provided:
[{"providerID": "f02838518","City": "XYZ", "Country": "Russia", "SPOrg","bitwind"},
{"providerID": "f02832475","City": "XYZ", "Country": "Thailand", "SPOrg","HS88"},
{"providerID": "f02859053","City": "XYZ", "Country": "USA", "SPOrg","chainup"},
{"providerID": "f02830321","City": "XYZ", "Country": "Vietnam", "SPOrg","OriginStorage"},
{"providerID": "f02837226","City": "XYZ", "Country": "Britain", "SPOrg","kinghash"},]

@Tom-OriginStorage
Author

(screenshot attached)

@herrehesse

Total amount of DataCap being requested
10PiB

Expected size of single dataset (one copy)
32G

Number of replicas to store
1

Why do we allow this?

@herrehesse

[{"providerID": "f02838518","City": "XYZ", "Country": "Russia", "SPOrg","bitwind"},
{"providerID": "f02832475","City": "XYZ", "Country": "Thailand", "SPOrg","HS88"},
{"providerID": "f02859053","City": "XYZ", "Country": "USA", "SPOrg","chainup"},
{"providerID": "f02830321","City": "XYZ", "Country": "Vietnam", "SPOrg","OriginStorage"},
{"providerID": "f02837226","City": "XYZ", "Country": "Britain", "SPOrg","kinghash"},]

Most of the above are not real; this is VPN abuse.

@Tom-OriginStorage
Author

@herrehesse
First, please provide your evidence.

Second, please list the websites you used to check the SP address locations. For the sake of fairness, I will use the same website to check your #2295. I hope your application can withstand the test.

Third, there are many dedicated address-detection tools; I hope you will use them. ipqualityscore: https://www.ipqualityscore.com/vpn-ip-address-check;
MaxMind: https://www.maxmind.com
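For reference, this is roughly what an offline country lookup against MaxMind's GeoLite2 database looks like. The .mmdb file must be downloaded from maxmind.com first, and the IP below is a documentation placeholder, not a real SP address.

```python
import geoip2.database
from geoip2.errors import AddressNotFoundError

with geoip2.database.Reader("GeoLite2-Country.mmdb") as reader:
    try:
        # Replace with the IP announced in the SP's multiaddr.
        response = reader.country("203.0.113.10")
        print(response.country.iso_code, response.country.name)
    except AddressNotFoundError:
        print("IP not found in the GeoLite2 database")
```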

@Tom-OriginStorage
Author

@Sunnyiscoming
As a well-known storage provider, Open Source has served customers with a total power of 2EiB over the past three years. All of our LDNs are fully compliant, with no problems of CID sharing or unreasonable data backups, and retrieval is also up to standard. Strictly speaking, our quality is much higher than that of 80% of our peers. Please tell me why #2295 passed without any questions being asked, while we have not passed yet.

@Tom-OriginStorage
Author

@Sunnyiscoming Any other questions?

@herrehesse

As a well-known storage provider, Open Source has served customers with a total power of 2EiB over the past three years.

How much of this is actually retrievable and stored with a hot copy? I guess near 0%.

Stop abusing this program, Tom.

@herrehesse

[{"providerID": "f02838518","City": "XYZ", "Country": "Russia", "SPOrg","bitwind"},
{"providerID": "f02832475","City": "XYZ", "Country": "Thailand", "SPOrg","HS88"},
{"providerID": "f02859053","City": "XYZ", "Country": "USA", "SPOrg","chainup"},
{"providerID": "f02830321","City": "XYZ", "Country": "Vietnam", "SPOrg","OriginStorage"},
{"providerID": "f02837226","City": "XYZ", "Country": "Britain", "SPOrg","kinghash"},]

ALL under VPN. I would put my hand in the fire claiming that these are all the same entity. Prove me wrong.

@Tom-OriginStorage
Author

@herrehesse
First, please provide your evidence.

Second, please list the websites you used to check the SP address locations. For the sake of fairness, I will use the same website to check your #2295. I hope your application can withstand the test.

Third, there are many dedicated address-detection tools; I hope you will use them. ipqualityscore: https://www.ipqualityscore.com/vpn-ip-address-check;
MaxMind: https://www.maxmind.com/

@herrehesse

You cannot provide clear evidence for the location of these nodes, because they are using VPNs to falsely show distribution.

@Tom-OriginStorage
Author

@herrehesse
Produce your evidence.

@Tom-OriginStorage
Author

@Sunnyiscoming Please review my project and, if there are no other issues, approve it.

@Sunnyiscoming
Collaborator

Datacap Request Trigger

Total DataCap requested

15PiB

Expected weekly DataCap usage rate

1PiB

Client address

f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa


Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzaceacw2ahud5bgwqzy5adruttuj2ltfj5f53mblel7cmtfrqydsczym

Address

f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa

Datacap Allocated

1.00PiB

Signer Address

f12mckci3omexgzoeosjvstcfxfe4vqw7owdia3da

Id

5a8e424e-7d0d-44a8-abec-9b0eb2cfb491

You can check the status of the message here: https://filfox.info/en/message/bafy2bzaceacw2ahud5bgwqzy5adruttuj2ltfj5f53mblel7cmtfrqydsczym


DataCap Allocation requested

Request number 5

Multisig Notary address

f02049625

Client address

f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa

DataCap allocation requested

2PiB

Id

aa4c155e-03da-4e00-b174-d8f744bab3aa
