Discussion - Onboarding projects with large DataCap requirements #94
My preference tends to be for 1.b. It has the advantage of avoiding a bottleneck when onboarding a large client, without impacting the notaries' day-to-day allocations.
Happy to clarify if it's not clear enough :)
Hi folks - proposing the following to get the discussion going on potential implementation paths. (Let's define "large client" as a project/use case/Client needing > 500 TiB of DataCap.)
This specifically allows for:
@jnthnvctr @s0nik42 thoughts?
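As a point of reference, the "> 500 TiB" threshold proposed above can be expressed programmatically. This is an illustrative sketch only - the function name and structure are my own, not part of any Fil+ tooling:

```python
# Illustrative check for the "large client" threshold discussed above.
# The helper name `is_large_client` is hypothetical, not part of Fil+ tooling.

TIB = 1024 ** 4  # one tebibyte, in bytes

LARGE_CLIENT_THRESHOLD_BYTES = 500 * TIB  # "large client" = more than 500 TiB

def is_large_client(requested_datacap_bytes: int) -> bool:
    """Return True if a DataCap request exceeds the large-client threshold."""
    return requested_datacap_bytes > LARGE_CLIENT_THRESHOLD_BYTES

# A 600 TiB request would route through the large-dataset process;
# a 100 TiB request would go through the regular notary flow.
print(is_large_client(600 * TIB))  # True
print(is_large_client(100 * TIB))  # False
```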
@dkkapur, I like the proposal. I think that type of client will need a single point of contact in any case to deal with Fil+. I would recommend we identify one of the 7 notaries to take on that role when the project starts. Actions could be:
@s0nik42 thanks - agreed, we should have a single notary-lead chosen from the set as well. For an initial version of this type of faucet, I would suggest we scope it to the following, to ensure we are on a safe path to test and unblock projects such as Starling without creating too much risk for the Fil+ program:
What do you think of this? IMO, erring on the side of caution early on, so that we build safe practices for scaling this up with confidence later, is a good way to proceed.
Hi @dkkapur, I think this is a very good place to start.
@s0nik42 thanks! We've had various conversations in Slack and offline with interested Clients and Notaries on this one over the last two weeks, so let's finalize the approach for the initial proposal in tomorrow's Notary Governance call! Recommending that we move forward based on the following (updating the bullets I shared above):
These updates enable a more nuanced approach to what is deemed "fair" or "reasonable" by a select set of Notaries who are then comfortable tracking and enforcing it. We should focus efforts on building tooling that brings transparency into the system, to ensure DataCap is being used to make Filecoin more useful! Tactical next steps include:
Draft of some of the questions that need to be included in the client application. The current plan is to manage these applications in a separate repo, i.e.,
Hey Deep, for the required materials in the application, you can use our guidelines as a reference; we believe they can better handle big clients.
@Fenbushi-Filecoin - this is great, thank you for sharing! I will look through it and propose some changes to the application structure. If there are any specific questions that have proven valuable in your experience, please let me know.
Per this week's call (2021-04-27 governance call), we're working on getting a v1 implementation of this up and running in the next few weeks! I will keep this issue updated with progress.
@Fenbushi-Filecoin - thanks again for sharing your comprehensive DataCap allocation writeup! Here are some things I think we should consider incorporating into the application:
Took a deeper dive today into potential sources of issues in a system of this sort, and would like to propose the following in addition to all the points listed above. This is largely an effort to serve an initial set of datasets that we can use to prove out the process as we start to move to larger-scale DataCap allocation and distribution.
Update: per the conversation in the last notary governance call, https://github.com/filecoin-project/filecoin-plus-large-datasets has been set up to start testing this process!
@dkkapur Hi, there might be some problems with the project details information when submitting a new issue for a large-datasets application. Please check it.
This topic continues to evolve as part of the broader LDN theme / path to DataCap. As such, I'm closing out this issue for now.
There are a few ongoing projects that have substantial DataCap requirements - over and above what exists in the ecosystem today.
...
Inevitably, there will be more.
This issue is to kick off discussion about ways the community can plan for and support early use cases.
To disentangle two issues that I believe arise here:
Early in this program we have limited amounts of DataCap in the ecosystem - though in a slightly more mature state this may not be a limitation. I believe there are two approaches here:
a) Under the rubric today, Notaries increase their DataCap allocations over subsequent allocations and elections - so it is possible to simply run the process as it exists today and hold many successive rounds of elections.
b) An alternative approach is to define a process (while Notaries have less DataCap than the projects need) where these projects can apply to the community for an allocation purpose-built for this use case (and administered by a set of the existing Notaries). The benefit is that other use cases applying for DataCap in this timeframe would not be blocked.
No single Notary would be able to service either of these projects (or other large-scale ones) properly. My proposal is that this is actually fine - Notaries should collaboratively support large-scale efforts (which will also require additional scrutiny to make sure the Client is using the DataCap appropriately).