
FIP Discussion: Remove Fplus Notary Process #203

Closed
Fatman13 opened this issue Nov 3, 2021 · 14 comments

Comments

@Fatman13
Contributor

Fatman13 commented Nov 3, 2021

Summary

Remove the centralized notary process and develop a new decentralized/algorithmic notary process.

Motivation

The current DataCap program not only disrupts a free storage market by creating manufactured demand, but also hinders the actual adoption of Filecoin as a viable alternative to traditional centralized storage services. As implemented today, the DataCap program is centralized, ineffective at scale, and has no incentive/punitive framework in place for its governance body (i.e. notaries), rendering them unproductive in carrying out governance policies.

By the design of the Filecoin storage market, SPs earn a portion of their revenue from providing retrieval services. As opposed to genuine market demand for storage, fabricated demand won't incentivize SPs to optimize their retrieval capabilities, since most data from the DataCap program has no commercial value at all. This stifles the development of a competitive user experience (on par with centralized services) for real-world storage use cases.

Ultimately, the program has failed to align the interests of storage providers with the long-term goal of the network, which is building a free storage market (free as in free market). Over a relatively long period (~1 year), roughly 4 PiB of DataCap was sealed by storage providers according to the Fil+ dashboard, compared to 14 EiB sealed by the whole network. I'd suggest cutting the program loose, or at least giving it a thorough evaluation before it becomes "too big to fail", starting with removing the centralized notary process.

As the current LDN process goes, verified deals have basically made regular deals irrelevant. The network might as well subsidize deal making directly at the protocol level instead of having applicants go through a centralized and ineffective process.

Current Design

The current (centralized) design of the notary process falls short in the following ways:

  • Most notaries are both player and referee
  • The centralized process is either slow and ineffective, or fast with no audits (it doesn't scale well)
  • No disputes/audits that I am aware of (please correct me if I am wrong)
  • Not helping real-world adoption, as not many real-world clients actually want to go through the DataCap application process (it would be more useful for a platform like nft.storage to get DataCap and use it to subsidize storage, but getting real-world clients to do that is too idealistic)
  • No way to verify private datasets
  • No way to determine whether a dataset is useful
  • No content moderation (NSFW)
  • Rules around the centralized process that some may find hard to follow

Therefore, I propose granting everyone any amount of DataCap, possibly with a lowered multiplier (10x right now), or developing other ways to subsidize DataCap deals.

New Design

Premises

New design should adhere to the following premises...

  • Free market principles (as presented here)
  • For real-world adoption
  • Decentralized process

Ideas

New design could incorporate the following ideas...

  • Remove current centralized datacap application process
  • Reward regular deal making with a multiplier
  • Reward a "datacap" deal each time the network successfully retrieves the deal data
    • something close to the off-chain windowPoSt dispute logic, but instead of slashing, it rewards successful retrievals;
    • no slashing on failure;
    • facilitates a better user experience for storage clients;
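The retrieval-reward idea above could be sketched roughly as follows (a hypothetical Python illustration; the `Deal` fields, `record_retrieval`, and the reward constant are all assumptions for discussion, not actual protocol parameters):

```python
from dataclasses import dataclass

@dataclass
class Deal:
    """A storage deal tracked for retrieval-based rewards (illustrative only)."""
    deal_id: int
    successful_retrievals: int = 0
    reward: float = 0.0

# Hypothetical subsidy per verified retrieval; a real design would tie this
# to protocol economics rather than a constant.
REWARD_PER_RETRIEVAL = 0.01

def record_retrieval(deal: Deal, succeeded: bool) -> float:
    """Reward a deal each time the network verifies a successful retrieval.

    Mirrors the off-chain windowPoSt dispute flow in spirit, but inverted:
    success pays out a small subsidy, and failure is NOT slashed (per the
    'no slashing on failure' point above).
    """
    if succeeded:
        deal.successful_retrievals += 1
        deal.reward += REWARD_PER_RETRIEVAL
        return REWARD_PER_RETRIEVAL
    return 0.0  # no penalty on a failed retrieval

# Example: two successful retrievals and one failure
deal = Deal(deal_id=42)
record_retrieval(deal, True)
record_retrieval(deal, False)
record_retrieval(deal, True)
print(deal.successful_retrievals, round(deal.reward, 4))  # → 2 0.02
```

The asymmetry (reward on success, no slash on failure) is the point: it nudges SPs toward good retrieval UX without adding new penalty risk to deal making.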

Consideration

Deal making already bears extra cost from message gas and operational overhead. I think it is reasonable to remove the current notary process and subsidize deal making directly until a decentralized process is developed, leaving most of it to the whim of the free market. If a dataset is useful, there should be high market demand to retrieve it, as opposed to its usefulness being determined by the personal preference of a notary.

@Fatman13 Fatman13 changed the title FIP Discussion: Remove Notary process FIP Discussion: Remove Notary Process Nov 3, 2021
@Fatman13 Fatman13 changed the title FIP Discussion: Remove Notary Process FIP Discussion: Remove Fplus Notary Process Nov 3, 2021
@cryptowhizzard

I don't think this is the way to go.

  • Most notaries I know work in Filecoin-affiliated settings and volunteer to give back something useful to the network.
  • If there is a dispute, there are recurring notary calls every 14 days to discuss it.
  • We have had talks with universities in our region. None of them has objected to doing some kind of due diligence.

In order to decide whether a dataset is useful for Filecoin, I guess there have to be humans involved, not just an automated process. At least a combination of both?

@Fatman13
Contributor Author

Most notaries I know work in Filecoin-affiliated settings and volunteer to give back something useful to the network.

Yes, I believe most notaries are do-gooders, but in a centralized process it only takes one bad actor to corrupt the whole system, whereas a decentralized approach could be more resistant.

If there is a dispute, there are recurring notary calls every 14 days to discuss it.

I haven't followed the process that closely, but my point is that no notary audit seems to have found anything yet.

We have had talks with universities in our region. None of them has objected to doing some kind of due diligence.

University datasets are good, but for real-world adoption that is not enough, in my humble opinion.

In order to decide whether a dataset is useful for Filecoin, I guess there have to be humans involved, not just an automated process. At least a combination of both?

Likely. My point is that the current centralized approach is far from ideal and not really geared towards real-world adoption. That's what I am worried about: Web3 only on paper, not in practice...

@dkkapur
Contributor

dkkapur commented Nov 11, 2021

@Fatman13 thanks for drafting this, good to have active discussion on how we can continue evolving the Fil+ system. Adding a few of my thoughts on the points you raised.

Specifically, for your list of bullets on current status quo:

Most Notaries are both player and referee

Agree - though, even if we engage active community members who aren't storage providers or clients in the network, a majority of active participants are active because they also have a stake in the network. I don't think this is necessarily a problem: if we have enough notaries with varied interests/stakeholder positions in the network, we should theoretically be able to converge on rules and evolve them over time to ensure a reasonable understanding of "fair" or "good" behavior vs. abuse of the system. However, building a system where we do engage enough notaries and enable them to help build this out is important, and is useful feedback for the current Fil+ setup.

The centralized process is either slow and ineffective, or fast with no audits (it doesn't scale well)

"Centralized" because it's either one notary or a small group of notaries? Audits do happen - in the past this has usually been at notary elections, and I expect it to happen a lot more frequently as subsequent allocations in LDN applications come through and are tracked better on the dashboard and in analysis.

No disputes/audits that I am aware of (please correct me if I am wrong)

There have been a couple, but the process is still young. Most folks flag issues in calls, Slack, or GitHub, and then work with the governance team to root-cause them and identify and propose next steps to the rest of the group in the following governance call.

Not helping real-world adoption, as not many real-world clients actually want to go through the DataCap application process (it would be more useful for a platform like nft.storage to get DataCap and use it to subsidize storage, but getting real-world clients to do that is too idealistic)

Agree - the current client onboarding process is painful for clients. The approach IMO should be to improve this client onboarding UX, and this is a current priority for the program! On the second point, deal brokers like Estuary and Textile's bidbot have received DataCap and are using it in the manner you described.

No way to verify private datasets

Disagree on this - several notaries have methods for doing this to build confidence in the dataset / client. In the last governance call, a new method was also proposed to do this for LDN applications, so this is actively being thought about.

No way to determine whether a dataset is useful

Would love to hear more on this one - is it more that there's no consistent standard for what is worthy of DataCap, or something else?

In general - I think that the approach should be more focused on:

  • making the experience better for clients
  • building better tooling that improves the due diligence / KYC process for clients and enables notaries to make more accurate decisions faster
  • biasing towards giving less DataCap faster to clients, doing analysis on on-chain actions after DataCap is received and deals are made, and using this to validate client trustworthiness
  • scaling up notaries and better defining the role as a "guardian" or "watchperson" rather than a gatekeeper of the system

These are aligned with our current goals in the Fil+ community and would result in a much more productive system to support clients in the network who establish the trust required to minimize the friction of onboarding data onto Filecoin.

Would love to see you at the next governance call to discuss this further as well!

@Fatman13
Contributor Author

@dkkapur, thank you for replying!

There have been a couple, but the process is still young. Most folks flag issues in calls, Slack, or GitHub, and then work with the governance team to root-cause them and identify and propose next steps to the rest of the group in the following governance call.

I am not sure the speed at which governance calls resolve issues can keep up with the speed of LDN allocation. But we shall see.

Audits do happen - in the past this has usually been at notary elections, and I expect it to happen a lot more frequently as subsequent allocations in LDN applications come through and are tracked better on the dashboard and in analysis.

We have seen rules broken by projects in Slingshot, and windowPoSts submitted on-chain disputed by other storage providers. Whether intentional or unintentional, there has to be some misbehaviour among the PiBs of DataCap allocated, and we see no such reports. That raises concerns about the fairness of the centralized notary process.

I don't think this is necessarily a problem: if we have enough notaries with varied interests/stakeholder positions in the network, we should theoretically be able to converge on rules and evolve them over time to ensure a reasonable understanding of "fair" or "good" behavior vs. abuse of the system.

It becomes a problem: firstly, we see no reports on misbehavior; secondly, there is no clear exit route for notaries. When storage is not proven for days, storage providers get slashed, but no such punitive measures are implemented for the centralized notary process.

Disagree on this - several notaries have methods for doing this to build confidence in the dataset / client. In the last governance call, a new method was also proposed to do this for LDN applications, so this is actively being thought about.

If we have ways to verify private data, then we can and should develop algorithmic ways to do it, making notaries irrelevant.

Would love to hear more on this one - is this more about there not being a consistent standard on what is worthy of DataCap or something else?

I personally think whether a dataset is useful should be determined by the market, not by notaries. One metric: if a dataset is useful, a lot of people will retrieve it on a regular basis.

  • making the experience better for clients
  • building better tooling that improves the due diligence / KYC process for clients and enables notaries to make more accurate decisions faster
  • biasing towards giving less DataCap faster to clients, doing analysis on on-chain actions after DataCap is received and deals are made, and using this to validate client trustworthiness
  • scaling up notaries and better defining the role as a "guardian" or "watchperson" rather than a gatekeeper of the system

All great points, and I totally agree with you. Given the premise that a decentralized process is better than a centralized one, I think there should be rules such that when A, B and C are realized, the current centralized notary process is dissolved. Or, if within some time frame A, B and C are not realized, the current centralized notary process should also be dissolved, or at least re-evaluated for effectiveness. One concern is that as an institution grows bigger, it exhausts its means to preserve itself, hunkers down, and defers progress.

@Helentaylorff

How about we add a layer of mechanism that maintains the current notary system while making the whole DataCap allocation process more open and transparent?

Here is my idea:
Each customer requesting DataCap is required to submit a "50G" or "32G" public file on the Web.Storage platform.
The notary must confirm the completion of these data samples before the quota can be approved.
In the past we didn't have an ideal platform to display sample data, but now we have an excellent one in Web.Storage.
I do not think this would be particularly costly, and it would enable the problem to be solved effectively.

@dayou5168

While the current notary process isn't perfect, I don't think notaries and DataCap allocations can be eliminated until a better incentive method is found for SPs to store data. I think the most important thing that needs to be done is still to build a strong group of clients and distribute a lot of DataCap to them. When they are extremely short of SPs and are looking in the Slack channel for SPs to store their data, maybe that's what we want.

@dkkapur
Contributor

dkkapur commented Nov 20, 2021

@Fatman13 - thanks, a few follow-ups:

We have seen rules broken by projects in Slingshot, and windowPoSts submitted on-chain disputed by other storage providers. Whether intentional or unintentional, there has to be some misbehaviour among the PiBs of DataCap allocated, and we see no such reports. That raises concerns about the fairness of the centralized notary process.

I'm especially interested in the latter type of issue (wPoSt disputes). What do you think is missing in the pipeline for reporting such issues for review by the Fil+ community? Would an anonymous issue collection form be a step in the right direction? Separately, is the "centralized" notary process here about the selection aspect, or about the reliance on individual notaries to carry out their own due diligence/follow-up process, which is not consistent/distributed among more entities/notaries?

It becomes a problem: firstly, we see no reports on misbehavior; secondly, there is no clear exit route for notaries. When storage is not proven for days, storage providers get slashed, but no such punitive measures are implemented for the centralized notary process.

Agree with this - in general, notary engagement / service levels have been an active topic of discussion at the last few governance calls. I expect that continued investment in transparency in the system will lead to notary reputation being tracked better, resulting in limitations on on-chain actions. Additionally, I do think we need to evolve the notary selection process a fair bit; perhaps we should explore staking or other alternatives that result in "more skin in the game" and better incentive alignment? Would really appreciate it if you could kick off a Discussion at https://github.com/filecoin-project/notary-governance/discussions if this is something you are interested in.

If we have ways to verify private data then we can/should develop algorithmic ways to verify them, making notaries irrelevant.

I don't think the point of notaries will be manual verification of data long term. The role will instead evolve into policy setting / becoming a guardian for the network: even if software can carry out due diligence, notaries will be the primary stakeholders in setting the criteria for how the software reacts to info about a client, data, on-chain behavior, etc.

I personally think whether a dataset is useful should be determined by the market not by notaries. One metric could be if a dataset is useful, a lot of people will retrieve such data on a regular basis.

ACK - agree on this. Difficult to track, given retrievals are not currently tracked on-chain, at the L1 level at least. Might be worth thinking about ways we can track this off-chain, at least for Fil+ verified deals. Same note as above about kicking off a Discussion if this is something you'd be interested in furthering the conversation on.

Given the premise that a decentralized process is better than a centralized one, I think there should be rules such that when A, B and C are realized, the current centralized notary process is dissolved. Or, if within some time frame A, B and C are not realized, the current centralized notary process should also be dissolved, or at least re-evaluated for effectiveness. One concern is that as an institution grows bigger, it exhausts its means to preserve itself, hunkers down, and defers progress.

Agree with this too - I think we are already going through a period of good change in the Fil+ system, and will continue to for the next several months as we make bigger changes to evolve the program and meet its goals of delivering a better client experience and ensuring the network continues to be leveraged as usefully as possible. Will flag re-evaluation of processes as part of the general goals we are tracking towards for Q1 of next year.

@Fatman13
Contributor Author

Fatman13 commented Nov 22, 2021

@dkkapur again, thanks for the reply!

I'm especially interested in the latter case of issue (wPoSt disputes).

Yes. A decentralized/algorithmic approach, like off-chain windowPoSt.

What do you think is missing in the pipeline for reporting such issues for review by the Fil+ community? Would an anonymous issue collection form be a step in the right direction?

These would all ideally work, but they require a lot of manual work and a proper incentive structure, which goes back to my original point that the current centralized notary process doesn't scale well.

Separately, is the "centralized" notary process here the selection aspect or the reliance on individual notaries to carry out their own due diligence/follow up process which is not consistent/distributed among more entities/notaries?

Both, I think. I haven't followed the notary process that closely, but from what I observe on the ground, the number of notary seats should really correspond to the relative storage power of each region. What use is DataCap if one region cannot find enough SPs to take the DataCap it has applied for? Consider that intercontinental data transfer is unreliable and each region has different regulations on data storage. As to your second aspect, most notaries have busy day jobs, and the idea that notaries would carry out due diligence/follow-up with no incentive/punitive structure in place is highly questionable. Apart from these two aspects, a centralized process also creates "politics", meaning a lot of nuance is involved in the application and approval processes.

However, all of the above is really beside the point. I personally think the data onboarding process should just be left to the whim of the free market, which means removing the current centralized notary process until a decentralized approach is developed, while adjusting the power multiplier to incentivize regular deals. By doing so, the Filecoin network could potentially unlock a new wave of growth, which is aligned with the current goal of the SP supporting team as I understand it from talking to them. "A free market for storage" is what Filecoin was advertised as on countless occasions last year. Unfortunately, right now the "free" market for storage is dictated by DataCap, which has NO free market elements at all.

ACK - agree on this. Difficult to track, given retrievals are not currently tracked on-chain, at the L1 level at least. Might be worth thinking about ways we can track this off-chain, at least for Fil+ verified deals. Same note as above about kicking off a Discussion if this is something you'd be interested in furthering the conversation on.

I am not sure about the technical details either. But AFAIK retrieval requests require payment, and all payments are on-chain. If we include the deal CID in the payment, then in theory retrievals could be tracked from on-chain data.
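Counting retrievals from payments might look something like the following sketch (hypothetical; real Filecoin retrieval payments flow through payment channels, and messages do not currently carry a deal CID, so the `deal_cid` field here is purely an assumed extension):

```python
from collections import Counter
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class PaymentMsg:
    """Simplified stand-in for an on-chain payment message (illustrative)."""
    payer: str
    payee: str
    amount: int              # attoFIL
    deal_cid: Optional[str]  # hypothetical field proposed above

def count_retrievals(msgs: Iterable[PaymentMsg]) -> Counter:
    """Tally retrieval payments per deal CID, ignoring unrelated payments."""
    tally: Counter = Counter()
    for m in msgs:
        if m.deal_cid and m.amount > 0:
            tally[m.deal_cid] += 1
    return tally

# Example ledger: two paid retrievals of one deal, one plain transfer,
# one retrieval of another deal (addresses and CIDs are made up).
msgs = [
    PaymentMsg("f1client", "f0sp", 100, "bafyDealA"),
    PaymentMsg("f1client", "f0sp", 100, "bafyDealA"),
    PaymentMsg("f1other", "f0sp", 50, None),  # not a retrieval payment
    PaymentMsg("f1client2", "f0sp2", 75, "bafyDealB"),
]
print(count_retrievals(msgs))  # bafyDealA counted twice, bafyDealB once
```

The tally could then feed the "market decides usefulness" metric discussed above: frequently retrieved deals are, by this measure, useful.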

Agree with this too - I think we are going through a period of good change in the Fil+ system already and will continue to for the next several months as we continue to make bigger changes to evolve the program and meet its goals of delivering a better client experience + ensuring the network continues to be leveraged as usefully as possible. Will flag re-evaluation of processes as part of our general goals we are tracking towards for Q1 of next year.

Cool!

@MegTei

MegTei commented Dec 7, 2021

This FIP and the solid arguments by @Fatman13 and @dkkapur are not new. We know the process needs improvement and have been discussing it for many months; let's move forward and introduce some change now. There are several issues/FIPs/discussions covering many of the same themes. How might we seek consensus on the issues that are affecting the entire value chain?
It's possible that by designing the end-to-end journey we will discover/explore issues and solve them through MVPs, seek feedback, iterate and so on.

@Fatman13 is correct, we are manufacturing the market; however, with <10% adoption we aren't doing a great conversion job and are possibly dis-incentivising non-DataCap growth.

A Design Thinking mindset could deliver efficiency & improvement across the entire spectrum of:

  1. Form consensus on the DAO rules/manifesto (things like what counts as quality data, how to tell, notary roles & responsibilities, client responsibilities, etc.)
  2. Map the experience journey across all actors - clients, notaries, storage providers, brokers, etc. This is aligned with the manifesto but "lives/changes" as the product roadmap and notary process do. It surfaces problems and opportunities which are validated back with the users
  3. Prioritise the automation that can speed up decisions/raise flags
  4. Design/run/test a notary incentive MVP
  5. Design/run/test an SP incentive MVP that scales/grades rewards from verified data through to private data.

Some of the above are already WIP; can we have more clarity on the timeframe for not just a plan in Q1, but some action too?

@Fatman13
Contributor Author

Fatman13 commented Dec 8, 2021

Hello @MegTei, thanks for the reply!

We know the process needs improvement and have been discussing for many months, Lets move forward and introduce some change now.

Yes. I think the key here is a time frame. Like Filecoin itself, the DataCap program has been around for ~1 year, and many of its fundamental problems, like scaling, incentive structure, data verification, etc., are still not properly addressed. I suggest that there be clear targets such that when the program fails to realize A, B and C, the current centralized notary process is dissolved. Or, if within some time frame A, B and C are not realized, the current centralized notary process should at least be re-evaluated for effectiveness. This would make the network adapt to the real-world environment more quickly.

How might we seek consensus on the issues that are affecting the entire value chain?

A vote weighted by storage power could perhaps be used for consensus?

@kaitlin-beegle
Contributor

Closing this issue for now!

@Fatman13, it sounds like you're moving towards a more collaborative discussion with @MegTei and @dkkapur. I would suggest you open a new issue in the FilecoinPlus governance repo. FilecoinPlus handles its own governance and program management; no FIP is needed in order to change program parameters. For more information, see FIP0003.

@Fatman13
Contributor Author

Fatman13 commented Jan 7, 2022

Hello @kaitlin-beegle, is it possible to keep this thread open in the discussion forum? As Fil+ was introduced by a FIP, I think the removal of the centralized notary process might still be a FIP discussion rather than a governance one?

@kaitlin-beegle
Contributor

@Fatman13 I think this is something we could have a good conversation about! From my perspective, however, FIP0003 is quite explicit in outlining who governs what for FilecoinPlus.

Simply speaking, anything about the mechanisms or procedures of the FilecoinPlus program is iterated within the program itself. The only thing that would require a FIP is a change to the stated vision or objective of the program, which this proposal does not seek to do. FilecoinPlus governs its own notary process and has its own procedures for implementing the kind of proposal you've suggested.

@dkkapur and @galen-mcandrew do you guys agree?

@kaitlin-beegle
Contributor

(Oh, additionally, we should point people in the FIPs Discussion Forum to the FilecoinPlus Discussion Forum as well. It's good governance wayfinding for these things to reference each other.)
