Increase the ceiling on large dataset applications #227
galen-mcandrew started this conversation in Ideas
Trying out a new format for discussing ideas and starting proposals! I think this may let community members reply directly to comments. This could let us start more conversations, and then once someone has heard from the community and refined their idea, they could open it as an issue with a specific proposal. Let's try it out!
📈 The ceiling is too low!
Currently, the large dataset process has a ceiling of 5 PiB per application. This was a starting point to help determine a safe upper bound for launching the large dataset flow. Right now, allocations with this process happen over time (rather than granting the full amount at once), and it still makes sense to have a ceiling.
That said, we already have some clients who know they need more than 5 PiB of DataCap! We are seeing this specifically with clients whose datasets are already large and who plan to store 5+ copies across different storage provider regions.
This is a great problem for us to have because it means there is more potential deal demand, and we want to make sure we are not losing these high-volume clients by requiring them to re-submit an application.
I think we should increase the upper limit on the large dataset applications to 20 PiB, knowing that we are still making smaller allocations over time.
This wouldn't change anything that we are experimenting with from issue 217, and all current and new clients could calculate their total requested DataCap (taking into account the number of copies they need) and update their applications.
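To make the copies math concrete, here is a rough sketch of how a client might tally their total requested DataCap against the current and proposed ceilings. The numbers and the function name are purely illustrative, not part of any existing tooling.

```python
def total_datacap_pib(dataset_size_pib: float, copies: int) -> float:
    """Total DataCap to request: one full copy per storage provider region."""
    return dataset_size_pib * copies

# Hypothetical example: a 3 PiB dataset stored across 5 regions
requested = total_datacap_pib(dataset_size_pib=3, copies=5)

print(f"Requested DataCap: {requested} PiB")            # 15.0 PiB
print("Fits under current 5 PiB ceiling: ", requested <= 5)   # False
print("Fits under proposed 20 PiB ceiling:", requested <= 20)  # True
```

Under the current 5 PiB ceiling, a client like this would have to split their request across multiple applications; under a 20 PiB ceiling, one application covers it, while allocations still happen incrementally over time.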
I would love to hear what the community thinks about this idea! Notaries, do you think this change will help us retain & accelerate some very large client deal flows?