
Community Review of QC Data Center Allocator #186

Closed
luhong123 opened this issue Oct 8, 2024 · 8 comments

@luhong123

Latest Compliance Report: https://compliance.allocator.tech/report/f03014608/1728349006/report.md

luhong123/QC-Data-center#3

luhong123/QC-Data-center#11

KYC exists

Retrievals look good for the SPs, and distribution looks good.

@filecoin-watchdog

First Review
Compliance Report
Allocator Application

DC was granted to two clients in this round:
  • New client
  • Ongoing client

The ongoing client was covered in the previous report. Nothing has worsened since then.


The first thing that catches the eye about the new client is that one dataset was declared at 1.32 PiB. Eight replicas would come to roughly 10.56 PiB, while the client asked for only 5 PiB.

This dataset was stored several times before on the Filecoin network. Why was another copy needed?
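A quick reconciliation of these figures can be scripted; below is a minimal sketch, with the numbers from this report hard-coded purely for illustration:

```python
# Minimal sketch: compare requested DataCap against dataset size x declared replicas.
# Figures below are the ones quoted in this report.
dataset_size_pib = 1.32   # declared dataset size in PiB
replicas = 8              # declared number of replicas
requested_pib = 5.0       # DataCap requested by the client in PiB

needed_pib = dataset_size_pib * replicas
print(f"Dataset x replicas: {needed_pib:.2f} PiB")   # ~10.56 PiB
print(f"Requested DataCap:  {requested_pib:.2f} PiB")

if abs(needed_pib - requested_pib) > 0.5:
    print("Declared replication does not match the requested DataCap; ask the client to reconcile.")
```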

The allocator asked additional questions and requested geographic information for one of the SPs.

SPs list provided:
f03100009 Hong Kong
f01084941 Hong Kong
f03100002 Shenzhen
f03100000 Shenzhen
f01975299 Shenzhen
f01084413 Changsha
f03161261 New York
f03192503 Hong Kong
f03189917 Dulles

SPs used for deals:
f03100009
f03216485
f01084941
f03100002
f01975299
f03100000
f01084413
f03161261
f03189917
f03214920
f03192503

The provided list of SPs includes 9 IDs, while the list of SPs used for deals includes 11; two SPs on the latter list (f03216485 and f03214920) weren't listed originally.
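This comparison is easy to automate with a set difference; a minimal sketch using the SP IDs quoted above:

```python
# Minimal sketch: flag SPs that received deals but were not on the
# client's originally provided SP list (IDs taken from this review).
provided = {
    "f03100009", "f01084941", "f03100002", "f03100000", "f01975299",
    "f01084413", "f03161261", "f03192503", "f03189917",
}
used_in_deals = {
    "f03100009", "f03216485", "f01084941", "f03100002", "f01975299",
    "f03100000", "f01084413", "f03161261", "f03189917", "f03214920",
    "f03192503",
}

undisclosed = sorted(used_in_deals - provided)
print("SPs used but not disclosed:", undisclosed)  # ['f03214920', 'f03216485']
```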


In general, the Allocator keeps asking questions when something seems off and runs CID Reports often.

@luhong123
Author

This dataset was stored several times before on the Filecoin network. Why was another copy needed?

I'd like to know how I'm supposed to check the records for this.

The provided list of SPs includes 9 IDs, while the list of SPs used for deals includes 11; two SPs on the latter list (f03216485 and f03214920) weren't listed originally.

I asked the client yesterday, and he explained as follows:
[screenshot]

However, considering all of these concerns about this new client, I will close this DataCap application and stop distributing the remaining DC.
[screenshot]

In general, the Allocator keeps asking questions when something seems off and runs CID Reports often.

Since the previous week was China's National Day holiday, I was away from work for about 10 days; I'll pay closer attention to that going forward.

@luhong123
Author

@filecoin-watchdog Thank you for your review

@filecoin-watchdog

@luhong123

I'd like to know how I'm supposed to check the records for this.

It might be done in several ways:

  1. Through allocator.tech, by searching for the dataset owner:
  [screenshot]
  2. By searching this archive repo, filecoin-plus-large-datasets, for the dataset owner, storage link, dataset website, etc. It requires careful analysis but gives room to ask clients additional questions (a scripted version of this search is sketched below).
  [screenshot]
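For the second route, the same lookup can be done programmatically against GitHub's issue search API. A minimal sketch, assuming the archive repo lives under the filecoin-project org and using a hypothetical dataset-owner name as a placeholder:

```python
# Minimal sketch: search the filecoin-plus-large-datasets archive repo for
# previous applications mentioning a dataset owner.
# Assumptions: the repo is filecoin-project/filecoin-plus-large-datasets, and
# DATASET_OWNER is a hypothetical placeholder for the name being checked.
import requests

DATASET_OWNER = "Example Dataset Owner"  # placeholder, not a real client
query = f'"{DATASET_OWNER}" repo:filecoin-project/filecoin-plus-large-datasets is:issue'

resp = requests.get(
    "https://api.github.com/search/issues",
    params={"q": query},
    headers={"Accept": "application/vnd.github+json"},
    timeout=30,
)
resp.raise_for_status()

results = resp.json()
print(f"Found {results['total_count']} matching application(s)")
for item in results["items"][:10]:
    print(f"- {item['title']}  {item['html_url']}")
```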

@luhong123
Author

@filecoin-watchdog Got it, thanks.

@galen-mcandrew
Collaborator

In addition to the above tools, we also see allocators using this client database and searching for the client name. We understand this process is not ideal, and we would love to see some additional tools or proposals from the community for improvements.

Additionally, we checked this larger client as well as the other smaller client, and we are seeing some duplicate data issues with SPs.

Despite these issues, we are generally seeing good compliance and diligence on this pathway. We are requesting that the allocator verify that they will uphold all aspects & requirements of their initial application, with some extra focus on these areas:

  • continuing to improve retrieval rates
  • investigating overall dataset size, replicas, and total DataCap requested calculations to avoid unnecessary sector padding
  • investigating data prep details from clients

If so, we will request an additional 10 PiB of DataCap from RKH, to allow this allocator to show increased diligence and alignment.

@luhong123
Author

Will follow up on these issues, thanks!

@Kevin-FF-USA
Collaborator

DataCap has been refilled.
https://datacapstats.io/notaries/f03014608
