1165 create initial conduit binary #1167
Conversation
Resolves #1165. Creates an initial framework for conduit binary production. Introduces a data directory flag as well as init and shutdown (basic) functionality.
Nice, I think it's a great start! Left some comments/questions--hope they help.
LGTM for a first pass. Let's continue the discussion around moving Genesis data to the Importer interface. Will also be good to get Will's opinion on this when he has time.
entry.Data["__type"] = f.Type | ||
entry.Data["_name"] = f.Name |
Why use 1 underscore in one key and 2 in the other?
It's to force the key order when outputting the JSON to the screen to be type then name. Like this:
{"__type":"Conduit","_name":"main","level":"info","msg":"Log level set to: warn","time":"2022-08-09T08:48:44-04:00"}
If one is looking at the logs manually, it is a little easier to read.
Looks like you can provide a sorting function for this kind of thing without messing around with the names: https://pkg.go.dev/github.com/sirupsen/logrus#TextFormatter.SortingFunc
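An untested sketch of that SortingFunc idea, using plain type/name keys purely for illustration:

```go
package main

import (
	"sort"

	log "github.com/sirupsen/logrus"
)

func main() {
	log.SetFormatter(&log.TextFormatter{
		// Rank "type" and "name" first; sort everything else alphabetically.
		SortingFunc: func(keys []string) {
			rank := func(k string) int {
				switch k {
				case "type":
					return 0
				case "name":
					return 1
				default:
					return 2
				}
			}
			sort.Slice(keys, func(i, j int) bool {
				if rank(keys[i]) != rank(keys[j]) {
					return rank(keys[i]) < rank(keys[j])
				}
				return keys[i] < keys[j]
			})
		},
	})
	log.WithFields(log.Fields{"type": "Conduit", "name": "main"}).Info("Log level set to: warn")
}
```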
I think that is only for TextFormatter, not JSONFormatter, though. Do we care if we lose the JSON data format? I figured the structured logging would be useful for log aggregation tools.
I'd like to merge just to keep development moving forward. Left some comments on a few things.
Also, we need to add conduit/pipeline_test.go, conduit/config_test.go, and probably processors/noop/noop_processor_test.go (even though I know it does nothing).
rnd, _ := strconv.Atoi(path.Base(r.URL.Path))
blk := rpcs.EncodedBlockCert{Block: bookkeeping.Block{BlockHeader: bookkeeping.BlockHeader{Round: basics.Round(rnd)}}}
var blockbytes []byte
w.WriteHeader(http.StatusOK)
response := struct {
	Block bookkeeping.Block `codec:"block"`
}{
	Block: blk.Block,
}
enc := codec.NewEncoderBytes(&blockbytes, protocol.CodecHandle)
enc.Encode(response)
w.Write(blockbytes)
Nit: I think we could simplify this with something like the following, but I haven't tested it.
Suggested change:

rnd, _ := strconv.Atoi(path.Base(r.URL.Path))
blk := rpcs.EncodedBlockCert{Block: bookkeeping.Block{BlockHeader: bookkeeping.BlockHeader{Round: basics.Round(rnd)}}}
blockbytes := protocol.EncodeJSON(blk)
w.WriteHeader(http.StatusOK)
w.Write(blockbytes)
cmd/conduit/main.go (outdated)
Args: cobra.NoArgs,
RunE: func(cmd *cobra.Command, args []string) error {

	// Tie cobra variables into local go-lang variables
Are we sure we want to use environment variables? How does that work if there is more than one level in the YAML file?
I think we need to modify it anyway since the function uses the INDEXER prefix, which isn't what we want.
For this level, we would only be setting the data directory via environment variables. All other (nested) configuration would be handled by the supplied config file conduit.yml and wouldn't be affected by environment variables.
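An untested sketch of what binding only that one value might look like with viper, avoiding the INDEXER_ prefix (names are illustrative):

```go
package main

import (
	"fmt"

	"github.com/spf13/viper"
)

func main() {
	// Bind only the top-level data directory to an explicitly named environment
	// variable; everything nested stays in conduit.yml and is untouched by env vars.
	_ = viper.BindEnv("data-dir", "CONDUIT_DATA_DIR")
	fmt.Println("data dir:", viper.GetString("data-dir"))
}
```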
importer   *importers.Importer
processors []*processors.Processor
exporter   *exporters.Exporter
I would consider making all of these slices and generating errors when creating the things below if something isn't supported yet.
Could you explain that a bit more? So would it become:
importer   []*importers.Importer
processors []*processors.Processor
exporter   []*exporters.Exporter
?
Right. This way you can use the same pattern for initializing all of them; only the error handling would change. I don't think this was my most important bit of feedback, just something to consider.
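A hypothetical sketch of that uniform pattern, using a placeholder plugin interface rather than the PR's actual importer/processor/exporter types:

```go
package main

import "fmt"

// plugin is a stand-in for the real importer/processor/exporter interfaces.
type plugin interface {
	Init() error
}

// initCategory initializes one plugin category the same way regardless of
// kind; only the count check differs (importers/exporters require exactly one).
func initCategory(name string, plugins []plugin, exactlyOne bool) error {
	if exactlyOne && len(plugins) != 1 {
		return fmt.Errorf("%s: expected exactly one plugin, got %d", name, len(plugins))
	}
	for _, p := range plugins {
		if err := p.Init(); err != nil {
			return fmt.Errorf("%s: init failed: %w", name, err)
		}
	}
	return nil
}

func main() {
	// Processors may be empty or many; importers and exporters would pass exactlyOne=true.
	if err := initCategory("processors", nil, false); err != nil {
		fmt.Println(err)
	}
}
```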
Codecov Report

@@            Coverage Diff            @@
##             conduit    #1167   +/-  ##
==========================================
  Coverage           ?   58.84%
==========================================
  Files              ?       66
  Lines              ?     8904
  Branches           ?        0
==========================================
  Hits               ?     5240
  Misses             ?     3196
  Partials           ?      468
}

// runConduitCmdWithConfig run the main logic with a supplied conduit config
func runConduitCmdWithConfig(cfg *conduit.Config) error {
Having viper/pflag stuff in the run function might make it more difficult to use this in the daemon later on. I think this can just accept a single dataDir string parameter.
Should we manually check CONDUIT_DATA_DIR since it's the only special case?
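A minimal, untested sketch of handling that single special case, with flag parsing elided:

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// dataDir would normally come from the -d/--data-dir flag; fall back to the
	// CONDUIT_DATA_DIR environment variable when the flag is not provided.
	dataDir := "" // value from the flag would go here
	if dataDir == "" {
		dataDir = os.Getenv("CONDUIT_DATA_DIR")
	}
	fmt.Println("using data directory:", dataDir)
}
```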
}

// Format allows this to be used as a logrus formatter
func (f PluginLogFormatter) Format(entry *log.Entry) ([]byte, error) {
I finally realized that you're implementing the Format interface for logrus. Neat trick 👍
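For readers following the thread, here is a simplified, untested sketch of that trick (not the PR's actual PluginLogFormatter): any type with a matching Format method satisfies the logrus Formatter interface.

```go
package main

import (
	log "github.com/sirupsen/logrus"
)

// pluginFormatter stamps plugin metadata onto every entry and then delegates
// to an inner formatter (JSONFormatter here).
type pluginFormatter struct {
	inner log.Formatter
	typ   string
	name  string
}

// Format implements the log.Formatter interface.
func (f pluginFormatter) Format(entry *log.Entry) ([]byte, error) {
	entry.Data["__type"] = f.typ
	entry.Data["_name"] = f.name
	return f.inner.Format(entry)
}

func main() {
	log.SetFormatter(pluginFormatter{inner: &log.JSONFormatter{}, typ: "Conduit", name: "main"})
	log.Info("Log level set to: warn")
}
```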
}

// Valid validates pipeline config
func (cfg *PipelineConfig) Valid() error {
This can be private
It might also be possible to set defaults and have some of it be automatic:
https://github.com/spf13/viper#establishing-defaults
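A tiny, untested sketch of viper defaults; the keys here are made up for illustration:

```go
package main

import (
	"fmt"

	"github.com/spf13/viper"
)

func main() {
	// Defaults apply whenever conduit.yml omits a key, which can remove some
	// manual validation. Key names are illustrative only.
	viper.SetDefault("log-level", "info")
	viper.SetDefault("importer.name", "algod")
	fmt.Println(viper.GetString("log-level"), viper.GetString("importer.name"))
}
```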
	},
)

err := (*processor).Init(p.ctx, *p.initProvider, "")
Is this missing the configuration and logger?
var initProvider data.InitProvider = &PipelineInitProvider{
	currentRound: &currentRound,
	genesis:      genesis,
}

p.initProvider = &initProvider
Suggested change:

p.initProvider = &PipelineInitProvider{
	currentRound: &currentRound,
	genesis:      genesis,
}
Also, consider renaming to pluginProvider if we're using the same one for all plugin types.
)

// TODO modify/fix ?
jsonEncode := string(json.Encode(p.cfg.Importer.Config))
You could put a String() function on the p.cfg.Importer.Config object to streamline this a bit.
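Something along these lines could work, sketched here with a hypothetical config type and the standard library JSON encoder rather than the PR's codec:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// PluginConfig is a stand-in for the p.cfg.Importer.Config type in the PR.
type PluginConfig map[string]interface{}

// String renders the config as JSON so callers can log it directly.
func (c PluginConfig) String() string {
	b, err := json.Marshal(c)
	if err != nil {
		return fmt.Sprintf("<unencodable config: %v>", err)
	}
	return string(b)
}

func main() {
	cfg := PluginConfig{"netaddr": "http://127.0.0.1:8080"}
	fmt.Println(cfg) // fmt calls String() automatically
}
```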
	},
)

err = (*p.exporter).Init("", p.logger)
This one is also missing the config.
	exporter *exporters.Exporter
}

func (p *pipelineImpl) Start() error {
Consider renaming to initialize, since I'm sure there is going to be a lot more to do for Start in the next iteration.
I think this is in a good spot, we can followup on any remaining tasks in later PRs.
Resolves #1165

Creates an initial framework for conduit binary production. Introduces a data directory flag as well as init and shutdown (basic) functionality.
Instructions:
- Run make in the project root. The conduit binary should now be located in cmd/conduit.
- mkdir -p /tmp/conduit_data_dir && touch /tmp/conduit_data_dir/conduit.yml
- Fill out /tmp/conduit_data_dir/conduit.yml, pointing it at an algod instance.
- Run cmd/conduit/conduit -d /tmp/conduit_data_dir/